
PolarDB: Getting started

Last Updated: Apr 21, 2025

Polar_AI is an AI extension provided by PolarDB that integrates advanced AI models and algorithms. It allows you to perform tasks such as machine learning and natural language processing in databases. This topic introduces the basic features of Polar_AI, including how to call AI models to perform embedding and sentiment classification in the database, and how to customize AI models using SQL statements to interact with other AI model services.

Supported versions

Polar_AI is supported when your PolarDB for PostgreSQL (Compatible with Oracle) cluster runs version 2.0 with revision version 2.0.14.28.0 or later.

Note

You can check the revision version of your cluster in the PolarDB console or by executing the SHOW polardb_version; statement. Upgrade the version if necessary.
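For example:

```sql
SHOW polardb_version;
```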

Terms

  • Natural language processing (NLP): a field of AI that focuses on enabling computers to understand and generate human language. This field includes technologies such as text classification, sentiment analysis, machine translation, and dialogue systems.

  • Embedding: an important concept in the fields of machine learning and NLP. This concept refers to the process of converting high-dimensional, sparse feature vectors (such as words in a dictionary and image pixels) into vector representations in a low-dimensional, dense, and continuous vector space.

Benefits

You can call and manage AI models within the database using standard SQL statements. This approach offers the following advantages:

  • Ease of use: You do not need deep AI expertise or complex programming skills. Basic SQL syntax is enough to complete the entire workflow from model training to inference, which allows non-specialists to build AI applications.

  • Flexibility and customization: In addition to the common AI algorithms provided by Polar_AI, you can import new models as your business requires, extending functionality with just a few SQL statements. As a result, tasks as different as text classification, image recognition, and time series forecasting can be handled efficiently under a unified framework.

  • Seamless data integration: Traditionally, outputs generated by AI models usually need to go through additional steps before they can be effectively integrated into existing information systems. However, in this solution, all AI computation results can be directly saved to the database and can be conveniently joined with other structured or unstructured data for querying and analysis, thus providing more comprehensive and accurate information support for decision making.

  • Data security: Throughout the computation process, raw data remains in a secure and reliable database environment, which avoids information leakage caused by frequent data transmission. Mature enterprise-level features such as fine-grained access control, audit trails, and encryption further strengthen the protection of the system.

  • Excellent performance: Because all computation tasks are executed within the database, the overhead of data migration is reduced, which improves response time and throughput. This is particularly important for scenarios that require high real-time performance.

  • Enterprise-level service support: With the advanced features of PolarDB, such as automatic failover, online scaling, and tiered storage, enterprise users can build stable and reliable large-scale data processing platforms.

Procedure

  1. Create the extension.

    Use a privileged account to execute the following statement:

    CREATE EXTENSION IF NOT EXISTS polar_ai;
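
    To confirm that the extension was created, you can query the standard PostgreSQL pg_extension catalog (the extversion value depends on your cluster):

    ```sql
    SELECT extname, extversion
    FROM pg_extension
    WHERE extname = 'polar_ai';
    ```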
  2. Prepare to create a custom AI model by using the AI_CreateModel function.

    • Deploy a model service and make sure it is available. In this example, deploy the DeepSeek-R1-Distill-Qwen-7B model on Platform for AI (PAI).

      Note

      The model service must be deployed in the same virtual private cloud (VPC) as your PolarDB cluster.

      After the deployment, go to Service details > Call Information. Click View Call Information to get the Access address and Token information.

    • Define the input function.

      The following sample HTTP request body calls the DeepSeek-R1-Distill-Qwen-7B model:

      {
        "model": "DeepSeek-R1-Distill-Qwen-7B",
        "prompt": "Hello!"
      }

      In the request body, the parameters model and prompt are required, and the parameter max_tokens is optional. The input function can be defined as follows:

      CREATE OR REPLACE FUNCTION ai_text_in_fn(model_name text, content text)
          RETURNS jsonb
          LANGUAGE plpgsql
          AS $function$
          BEGIN
              -- jsonb_build_object escapes special characters (such as double
              -- quotation marks) in the input, which plain string concatenation
              -- would not.
              RETURN jsonb_build_object('model', model_name, 'prompt', content);
          END;
          $function$;
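
      You can verify the input function locally before wiring it to a model:

      ```sql
      SELECT ai_text_in_fn('DeepSeek-R1-Distill-Qwen-7B', 'Hello!');
      -- Returns: {"model": "DeepSeek-R1-Distill-Qwen-7B", "prompt": "Hello!"}
      ```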
  3. Create an AI model.

    Call the AI_CreateModel function to create a model. The syntax of the AI_CreateModel function is as follows:

    table AI_CreateModel(text model_id, text model_url, text model_provider, text model_type, text model_name, json model_config, regprocedure model_headers_fn, regprocedure model_in_transform_fn, regprocedure model_out_transform_fn);

    In the SQL statement that creates the model, you need to configure the following parameters:

    • Set the model_url parameter to the access address that you get in step 2 with the call method appended. The Chat and Completions call methods are supported. In this example, the Completions method is used by appending /v1/completions.

    • Set the model_config parameter to the AI model service token that you get in step 2.

    • Set the model_in_transform_fn parameter to ai_text_in_fn, which is the input function name.

    Execute the following SQL statement to create an AI model:

    SELECT polar_ai.ai_createmodel(
        'my_test_pai_model',
        '<access address>/v1/completions',
        'Alibaba',
        'EAS large language model',
        'DeepSeek-R1-Distill-Qwen-7B',
        '{"author_type": "token", "token": "<model service token>"}',
        NULL,
        'ai_text_in_fn'::regproc,
        NULL
    );
    Note

    To view the information about the created AI model, execute the SELECT * FROM polar_ai._ai_models; statement.

  4. Create a function to call the AI model.

    Use the AI_CallModel function to create a function to call the AI model:

    CREATE OR REPLACE FUNCTION my_text_pai_model_func(model_id text, content text)
        RETURNS text
        AS $$
            SELECT (polar_ai.ai_callmodel($1, $2)::jsonb -> 'choices' -> 0 ->> 'text')::text AS result
        $$
        LANGUAGE sql IMMUTABLE;
  5. Call the AI model by executing the following SQL statement.

    SELECT my_text_pai_model_func('my_test_pai_model', 'Hello');

    Sample output:

            my_text_pai_model_func        
    --------------------------------------
     Hello! How can I assist you today? 
    (1 row)