The ai-prompt-template plug-in lets you define reusable, parameterized prompt templates for large language model (LLM) requests. Instead of embedding full prompts in every API call, clients reference a named template and supply only the variable values. This reduces request payload size, enforces consistent prompt patterns across teams, and lets you expose a general-purpose LLM as a purpose-built API -- for example, a code-generation bot or a translation service.
How it works
1. You define one or more prompt templates in the plug-in configuration. Each template specifies the model, message roles, and message content, with `{{variable}}` placeholders for dynamic values.
2. A client sends a request that contains the template name and the variable values.
3. The plug-in substitutes the variables into the template and forwards the fully constructed prompt to the LLM.
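The substitution in step 3 can be sketched as follows. This is a minimal Python illustration of the behavior, not the plug-in's actual implementation; the `render_template` function and its data layout are assumptions made for the sketch, though the template structure mirrors the configuration format shown later in this page.

```python
import re

def render_template(template: dict, properties: dict) -> dict:
    """Replace {{variable}} placeholders in each message with client-supplied values."""
    rendered = {"model": template["model"], "messages": []}
    for msg in template["messages"]:
        # Substitute each {{name}} with properties["name"]; leave unknown
        # placeholders untouched so the gap is visible downstream.
        content = re.sub(
            r"\{\{(\w+)\}\}",
            lambda m: str(properties.get(m.group(1), m.group(0))),
            msg["content"],
        )
        rendered["messages"].append({"role": msg["role"], "content": content})
    return rendered

template = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system",
         "content": "You are a {{program}} expert, in {{language}} programming language."},
        {"role": "user", "content": "Write me a {{program}} program."},
    ],
}
result = render_template(template, {"program": "quick sort", "language": "python"})
print(result["messages"][1]["content"])  # Write me a quick sort program.
```

Because the client supplies only the template name and the `properties` map, the prompt text itself never travels over the wire, which is where the payload-size saving comes from.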
Request transformation example
The following example shows how the plug-in transforms a lightweight client request into a complete LLM prompt.
Template configuration:
```yaml
templates:
  - name: "developer-chat"
    template:
      model: gpt-3.5-turbo
      messages:
        - role: system
          content: "You are a {{program}} expert, in {{language}} programming language."
        - role: user
          content: "Write me a {{program}} program."
```

Client request:
```json
{
  "template": "developer-chat",
  "properties": {
    "program": "quick sort",
    "language": "python"
  }
}
```

Resulting LLM request after variable substitution:
```json
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "system",
      "content": "You are a quick sort expert, in python programming language."
    },
    {
      "role": "user",
      "content": "Write me a quick sort program."
    }
  ]
}
```

Running attributes
| Attribute | Value |
|---|---|
| Execution stage | default stage |
| Execution priority | 500 |
Configuration
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `templates` | array of object | Yes | - | One or more prompt templates. See Template object parameters. |
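Because `templates` is an array, a single plug-in instance can expose several named templates side by side. The following configuration fragment is illustrative (the template names and prompt text are invented for the example; field names follow the tables on this page):

```yaml
templates:
  - name: "developer-chat"
    template:
      model: gpt-3.5-turbo
      messages:
        - role: system
          content: "You are a {{program}} expert, in {{language}} programming language."
        - role: user
          content: "Write me a {{program}} program."
  - name: "translator"
    template:
      model: gpt-3.5-turbo
      messages:
        - role: system
          content: "You are a translator. Translate the user's text into {{target_language}}."
        - role: user
          content: "{{text}}"
```

Clients select between them per request via the `template` field in the request body.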
Template object parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `name` | string | Yes | - | A unique identifier for the template. Clients reference this value in the `template` field of the request body. |
| `template.model` | string | Yes | - | The LLM model to use, for example `gpt-3.5-turbo`. |
| `template.messages` | array of object | Yes | - | The ordered sequence of messages sent to the LLM. See Message object parameters. |
Message object parameters
| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `role` | string | Yes | - | The message role, for example `system` or `user`. |
| `content` | string | Yes | - | The message text. Use `{{variable}}` syntax to insert placeholders that are replaced with values from the client request at runtime. |
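One consequence of the `{{variable}}` syntax is that the set of variables a template requires can be derived from the template itself. The sketch below is illustrative only (the `placeholder_names` helper is an assumption, and the plug-in's actual handling of missing variables may differ); it shows how a client could check that its `properties` cover every placeholder before sending a request:

```python
import re

def placeholder_names(messages: list) -> set:
    """Collect every {{variable}} name used across a template's messages."""
    names = set()
    for msg in messages:
        names.update(re.findall(r"\{\{(\w+)\}\}", msg["content"]))
    return names

messages = [
    {"role": "system",
     "content": "You are a {{program}} expert, in {{language}} programming language."},
    {"role": "user", "content": "Write me a {{program}} program."},
]

required = placeholder_names(messages)
print(sorted(required))  # ['language', 'program']

# A client request is complete when its properties supply every placeholder.
properties = {"program": "quick sort", "language": "python"}
missing = required - set(properties)
print(missing)  # set()
```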