Creates an AI model and registers it in a metadata table for invocation.
Syntax
```sql
table AI_CreateModel(
    text model_id,
    text model_url,
    text model_provider,
    text model_type,
    text model_name,
    json model_config,
    regprocedure model_headers_fn,
    regprocedure model_in_transform_fn,
    regprocedure model_out_transform_fn
);
```
Parameters
Parameter | Description |
--- | --- |
model_id | The unique identifier of the AI model. The value must be different from that of any existing AI model. Note: The identifier of the AI model cannot start with an underscore (_). When you create the Polar_AI extension, the system automatically creates a set of built-in AI models whose identifiers start with an underscore (_). To view these AI models, query the polar_ai._ai_models table. |
model_url | The URL used to access the AI model. The value cannot be NULL. Supported protocols include HTTP, HTTPS, and FILE. For example, the URL can be the endpoint included in the call information of the model that is deployed on Platform for AI (PAI). |
model_provider | The provider of the AI model. The value can be NULL. Examples: AWS, Alibaba, Baidu, and Tencent. |
model_type | The type of the AI model. The value can be NULL. Examples: LSTM and GRU. |
model_name | The name of the AI model. The value cannot be NULL. Example: text-embedding-v2. |
model_config | The configuration information of the AI model, in the JSON format. The value cannot be NULL. Format: { "author_type":"token", "token":"<YOUR_API_KEY>" }. |
model_headers_fn | The function used to construct the request headers. The returned value is in the JSONB format. If the model has no special header requirements, you do not need to configure this parameter. Default value: NULL. A sketch of such a function is shown after this table. |
model_in_transform_fn | The function used to convert the input data to a valid format. The value cannot be NULL. |
model_out_transform_fn | The function used to parse and transform the output data returned by the AI model. The value cannot be NULL. |
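If a model does require custom headers, model_headers_fn points to a user-defined function that returns those headers as a JSONB object. This documentation does not specify the argument list that Polar_AI passes to such a function, so the following sketch is purely illustrative: the single model_id argument, the function name, and the header names are assumptions that you must adapt to your model's API.

```sql
-- Illustrative only: the argument list (model_id text) is an assumption,
-- not a documented Polar_AI contract. The function simply returns a JSONB
-- object whose keys and values are intended to be used as request headers.
CREATE OR REPLACE FUNCTION my_model_headers_fn(model_id text)
RETURNS jsonb
LANGUAGE sql
AS $$
    SELECT jsonb_build_object(
        'Content-Type',  'application/json',
        'Authorization', 'Bearer <YOUR_API_KEY>'  -- hypothetical placeholder
    );
$$;
```

If you use a header function like this, pass it as 'my_model_headers_fn'::regproc for the model_headers_fn parameter instead of NULL when you call AI_CreateModel.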
Return values
Returns the AI model creation result in tabular form. The following table describes the result parameters.
Parameter | Description |
--- | --- |
model_seq | The sequence number of the AI model. |
model_schema | The workspace to which the AI model belongs. |
model_id | The unique identifier of the AI model. |
model_qname | The name of the AI model. |
created | Indicates whether the AI model is created. |
Input and output functions
AI_CreateModel registers the metadata of an AI model in a metadata table but does not call or execute the model. For information about how to call an AI model, see AI_CallModel.

The input function specified by the model_in_transform_fn parameter converts user-defined input into a JSONB-formatted HTTP request body that the AI model requires. The following example shows the output of the input function for the DeepSeek-R1-Distill-Qwen-7B model deployed on PAI:

```json
{
  "model": "DeepSeek-R1-Distill-Qwen-7B",
  "prompt": "Hello!"
}
```

The request body must contain the model and prompt parameters. You can define the input function as follows:

```sql
CREATE OR REPLACE FUNCTION ai_text_in_fn(model_name text, content text)
RETURNS jsonb
LANGUAGE plpgsql
AS $function$
BEGIN
    RETURN ('{"model": "'|| model_name ||'","prompt":"'|| content ||'"}')::jsonb;
END;
$function$;
```

The output function specified by the model_out_transform_fn parameter converts the model's output, which is usually in the JSON format, into the required format. The following example shows the JSON result returned by the DeepSeek-R1-Distill-Qwen-7B model:

```json
{
  "id": "8e44xxxx",
  "object": "text_completion",
  "created": 1744355891,
  "model": "DeepSeek-R1-Distill-Qwen-7B",
  "choices": [
    {
      "index": 0,
      "text": " I have a\n\n\n</think>\n\nHello! How can I assist you today? ",
      "logprobs": null,
      "finish_reason": "stop",
      "matched_stop": 151643
    }
  ],
  "usage": {
    "prompt_tokens": 3,
    "total_tokens": 21,
    "completion_tokens": 18,
    "prompt_tokens_details": null
  }
}
```

Only the choices[0].text part of the returned result is required. The response parameter of the output function must be of the JSONB type. You can define the output function as follows:

```sql
CREATE OR REPLACE FUNCTION ai_text_out_fn(model_id text, response_json jsonb)
RETURNS jsonb
AS $$
    SELECT (response_json->'choices'->0->'text')::jsonb
$$ LANGUAGE sql IMMUTABLE;
```
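Before you register the model, you can optionally verify the two functions on their own. The following statements are not part of the original procedure; they simply call the functions defined above, and the JSONB literal in the second query is an abbreviated stand-in for the response format shown above:

```sql
-- Check that the input function builds the request body the model expects.
SELECT ai_text_in_fn('DeepSeek-R1-Distill-Qwen-7B', 'Hello!');
-- Returns: {"model": "DeepSeek-R1-Distill-Qwen-7B", "prompt": "Hello!"}

-- Check that the output function extracts choices[0].text. The first
-- argument is not used by this particular function, so any text works.
SELECT ai_text_out_fn(
    'any_model_id',
    '{"choices": [{"index": 0, "text": "Hello! How can I assist you today?"}]}'::jsonb
);
-- Returns: "Hello! How can I assist you today?"
```

Note that ai_text_in_fn builds the request body by string concatenation, so it assumes the prompt contains no characters that require JSON escaping, such as double quotation marks or backslashes. If that cannot be guaranteed, jsonb_build_object('model', model_name, 'prompt', content) is a safer way to construct the same JSONB value.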
Examples
The following procedure uses the DeepSeek-R1-Distill-Qwen-7B model deployed on PAI as an example to describe how to register a model.
1. Create an input function for the model.

```sql
CREATE OR REPLACE FUNCTION ai_text_in_fn(model_name text, content text)
RETURNS jsonb
LANGUAGE plpgsql
AS $function$
BEGIN
    RETURN ('{"model": "'|| model_name ||'","prompt":"'|| content ||'"}')::jsonb;
END;
$function$;
```

2. Create an output function for the model.

```sql
CREATE OR REPLACE FUNCTION ai_text_out_fn(model_id text, response_json jsonb)
RETURNS jsonb
AS $$
    SELECT (response_json->'choices'->0->'text')::jsonb
$$ LANGUAGE sql IMMUTABLE;
```

3. Register the metadata of the model in a metadata table. Complete the following configurations:
- Set model_url to the VPC endpoint of the model deployed on PAI with the invocation path /v1/completions appended.
- Set token in model_config to the token required to access the AI model.
- Set model_in_transform_fn to the ai_text_in_fn function.
- Set model_out_transform_fn to the ai_text_out_fn function.
```sql
SELECT polar_ai.AI_CreateModel(
    'my_text_pai_model',
    '<EAS Endpoint>/v1/completions',
    'Alibaba',
    'EAS LLM',
    'DeepSeek-R1-Distill-Qwen-7B',
    '{"author_type": "token", "token": "<EAS Token>"}',
    NULL,
    'ai_text_in_fn'::regproc,
    'ai_text_out_fn'::regproc
);
```

4. View the result. For more information, see Return values.
```
                        ai_createmodel
-------------------------------------------------------------
 (1,my_text_pai_model,polar_ai,DeepSeek-R1-Distill-Qwen-7B,t)
```

5. (Optional) View the created AI model.
```sql
SELECT * FROM polar_ai._ai_models;
```

Sample result:
model_seq | model_id | model_url | model_provider | model_type | model_name | model_config | model_headers_fn | model_in_transform_fn | model_out_transform_fn |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
1 | my_text_embedding_model | https://dashscope.aliyuncs.com/api/v1/services/embeddings/text-embedding/text-embedding | Alibaba | General-purpose text embedding model | text-embedding-v2 | {"token": "20E0BE9E5438xxxxxxxxxxxxxxxxxxxx", "author_type": "token"} | - | ai_text_embedding_in_fn(text,text) | ai_text_embedding_out_fn(text,jsonb) |
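The polar_ai._ai_models table also contains the built-in AI models whose identifiers start with an underscore (_). If you want to list only the models that you registered yourself, you can filter them out. The following query is a convenience suggestion rather than part of the registration procedure:

```sql
-- List only user-registered models; the built-in models created by the
-- Polar_AI extension have identifiers that start with an underscore.
SELECT model_seq, model_id, model_name
FROM polar_ai._ai_models
WHERE model_id NOT LIKE '\_%';
```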