Version: Next

ai-prompt-template

Description#

The ai-prompt-template plugin simplifies access to LLM providers, such as OpenAI and Anthropic, and their models by predefining the request format with templates; clients only supply values for the template variables.

Plugin Attributes#

| Field | Required | Type | Description |
| ----- | -------- | ---- | ----------- |
| templates | Yes | Array | An array of template objects. |
| templates.name | Yes | String | Name of the template. |
| templates.template.model | Yes | String | Name of the LLM model, for example gpt-4 or gpt-3.5. See your LLM provider's API documentation for available models. |
| templates.template.messages.role | Yes | String | Role of the message (system, user, or assistant). |
| templates.template.messages.content | Yes | String | Content of the message. |

Example usage#

Create a route with the ai-prompt-template plugin like so:

curl "http://127.0.0.1:9180/apisix/admin/routes/1" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "uri": "/v1/chat/completions",
    "upstream": {
      "type": "roundrobin",
      "nodes": {
        "api.openai.com:443": 1
      },
      "scheme": "https",
      "pass_host": "node"
    },
    "plugins": {
      "ai-prompt-template": {
        "templates": [
          {
            "name": "level of detail",
            "template": {
              "model": "gpt-4",
              "messages": [
                {
                  "role": "user",
                  "content": "Explain about {{ topic }} in {{ level }}."
                }
              ]
            }
          }
        ]
      }
    }
  }'

Now send a request to the route:

curl http://127.0.0.1:9080/v1/chat/completions -i -X POST \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer <your token here>" \
  -d '{
    "template_name": "level of detail",
    "topic": "psychology",
    "level": "brief"
  }'

The plugin then rewrites the request body before forwarding it to the upstream:

{
  "model": "gpt-4",
  "messages": [
    { "role": "user", "content": "Explain about psychology in brief." }
  ]
}