
Microservices Engine: AI prompt template

Last Updated: Mar 11, 2026

The ai-prompt-template plug-in lets you define reusable, parameterized prompt templates for large language model (LLM) requests. Instead of embedding full prompts in every API call, clients reference a named template and supply only the variable values. This reduces request payload size, enforces consistent prompt patterns across teams, and lets you expose a general-purpose LLM as a purpose-built API -- for example, a code-generation bot or a translation service.

How it works

  1. You define one or more prompt templates in the plug-in configuration. Each template specifies the model, message roles, and message content, with {{variable}} placeholders for dynamic values.

  2. A client sends a request that contains the template name and the variable values.

  3. The plug-in substitutes the variables into the template and forwards the fully constructed prompt to the LLM.
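The substitution in step 3 can be sketched in a few lines of Python. This is a minimal illustration of the {{variable}} replacement behavior, not the plug-in's actual implementation:

```python
import re

def render(template: dict, properties: dict) -> dict:
    """Replace every {{variable}} placeholder in the template's message
    contents with the matching value from the client's properties."""
    def substitute(text: str) -> str:
        # Matches a placeholder such as {{program}}, capturing the name.
        # A missing property raises KeyError; real plug-ins may handle
        # missing variables differently.
        return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                      lambda m: str(properties[m.group(1)]), text)

    return {
        "model": template["model"],
        "messages": [
            {"role": msg["role"], "content": substitute(msg["content"])}
            for msg in template["messages"]
        ],
    }
```

Applied to the developer-chat template in the next section, this sketch produces the same result as the documented transformation.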

Request transformation example

The following example shows how the plug-in transforms a lightweight client request into a complete LLM prompt.

Template configuration:

templates:
- name: "developer-chat"
  template:
    model: gpt-3.5-turbo
    messages:
    - role: system
      content: "You are a {{program}} expert, in {{language}} programming language."
    - role: user
      content: "Write me a {{program}} program."

Client request:

{
  "template": "developer-chat",
  "properties": {
    "program": "quick sort",
    "language": "python"
  }
}

Resulting LLM request after variable substitution:

{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "system",
      "content": "You are a quick sort expert, in python programming language."
    },
    {
      "role": "user",
      "content": "Write me a quick sort program."
    }
  ]
}

Running attributes

Attribute            Value
Execution stage      default stage
Execution priority   500

Configuration

Parameter   Type              Required   Default   Description
templates   array of object   Yes        -         One or more prompt templates. See Template object parameters.

Template object parameters

Parameter           Type              Required   Default   Description
name                string            Yes        -         A unique identifier for the template. Clients reference this value in the template field of the request body.
template.model      string            Yes        -         The LLM model to use, for example gpt-3.5-turbo.
template.messages   array of object   Yes        -         The ordered sequence of messages sent to the LLM. See Message object parameters.

Message object parameters

Parameter   Type     Required   Default   Description
role        string   Yes        -         The message role, for example system or user.
content     string   Yes        -         The message text. Use {{variable}} syntax to insert placeholders that are replaced with values from the client request at runtime.
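Because every {{variable}} in a message's content must be supplied by the client, a useful pre-flight check is to extract the placeholder names from a template's messages and compare them against the keys in a request's properties. The helper below is a hypothetical sketch for such validation, not a feature of the plug-in:

```python
import re

def missing_variables(messages: list, properties: dict) -> set:
    """Return the placeholder names that appear in the messages but are
    absent from the client-supplied properties."""
    placeholders = set()
    for msg in messages:
        # Collect every name that appears as {{name}} in the content.
        placeholders.update(re.findall(r"\{\{\s*(\w+)\s*\}\}", msg["content"]))
    return placeholders - set(properties)
```

For the developer-chat example, a client request whose properties omit language would yield {"language"}, signaling an incomplete request before it reaches the LLM.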