---
title: ai-prompt-template
---
The `ai-prompt-template` plugin simplifies access to LLM providers, such as OpenAI and Anthropic, and their models by predefining the request format in a template, allowing users to pass only customized values into template variables.
| Field | Required | Type | Description |
|---|---|---|---|
| templates | Yes | Array | An array of template objects. |
| templates.name | Yes | String | Name of the template. |
| templates.template.model | Yes | String | Name of the LLM model, for example `gpt-4` or `gpt-3.5`. See your LLM provider's API documentation for more available models. |
| templates.template.messages.role | Yes | String | Role of the message (`system`, `user`, or `assistant`). |
| templates.template.messages.content | Yes | String | Content of the message. |
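Putting these fields together, a minimal `templates` value might look like the following sketch. The template name, model, and `{{ text }}` placeholder are illustrative, not prescribed by the plugin:

```json
{
  "templates": [
    {
      "name": "summarize",
      "template": {
        "model": "gpt-4",
        "messages": [
          {
            "role": "user",
            "content": "Summarize the following text: {{ text }}"
          }
        ]
      }
    }
  ]
}
```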
Create a route with the `ai-prompt-template` plugin like so:
```shell
curl "http://127.0.0.1:9180/apisix/admin/routes/1" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "uri": "/v1/chat/completions",
    "upstream": {
      "type": "roundrobin",
      "nodes": {
        "api.openai.com:443": 1
      },
      "scheme": "https",
      "pass_host": "node"
    },
    "plugins": {
      "ai-prompt-template": {
        "templates": [
          {
            "name": "level of detail",
            "template": {
              "model": "gpt-4",
              "messages": [
                {
                  "role": "user",
                  "content": "Explain about {{ topic }} in {{ level }}."
                }
              ]
            }
          }
        ]
      }
    }
  }'
```
Now send a request:
```shell
curl "http://127.0.0.1:9080/v1/chat/completions" -i -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <your token here>" \
  -d '{
    "template_name": "level of detail",
    "topic": "psychology",
    "level": "brief"
  }'
```
The request body sent upstream will then be modified to something like this:
```json
{
  "model": "gpt-4",
  "messages": [
    {
      "role": "user",
      "content": "Explain about psychology in brief."
    }
  ]
}
```
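Conceptually, the rewrite the plugin performs can be sketched as a simple `{{ variable }}` substitution over the selected template. The helper below is a hypothetical illustration, not the plugin's actual (Lua) implementation:

```python
import re

def render_template(template: dict, request_body: dict) -> dict:
    """Replace {{ var }} placeholders in message contents with values
    from the incoming request body; unknown placeholders are left as-is."""
    rendered = {"model": template["model"], "messages": []}
    for msg in template["messages"]:
        content = re.sub(
            r"\{\{\s*(\w+)\s*\}\}",
            lambda m: str(request_body.get(m.group(1), m.group(0))),
            msg["content"],
        )
        rendered["messages"].append({"role": msg["role"], "content": content})
    return rendered

# Template and request body from the example above.
template = {
    "model": "gpt-4",
    "messages": [
        {"role": "user", "content": "Explain about {{ topic }} in {{ level }}."}
    ],
}
body = {"template_name": "level of detail", "topic": "psychology", "level": "brief"}

print(render_template(template, body))
# → {'model': 'gpt-4', 'messages': [{'role': 'user', 'content': 'Explain about psychology in brief.'}]}
```

Note that `template_name` is only used to select which template applies; it does not appear in the rewritten body.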