This guide walks you through setting up the AI Proxy plugin with Anthropic.
For all providers, the Kong AI Proxy plugin attaches to route entities.
It can be installed into one route per operation, for example:
- OpenAI chat route
- Cohere chat route
- Cohere completions route
Each of these AI-enabled routes must point to a null service. This service doesn’t need to map to any real upstream URL; it can point somewhere empty (for example, http://localhost:32000), because the AI Proxy plugin overwrites the upstream URL. This requirement will be removed in a later Kong revision.
Prerequisites
- Anthropic account and subscription
- You need a service to contain the route for the LLM provider. Create a service first:
curl -X POST http://localhost:8001/services \
--data "name=ai-proxy" \
--data "url=http://localhost:32000"
Remember that the upstream URL can point anywhere empty, as it won’t be used by the plugin.
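To confirm the service exists before moving on, you can read it back from the Admin API (this assumes Kong's Admin API is listening on the default localhost:8001 address):

curl -s http://localhost:8001/services/ai-proxy

A JSON object describing the ai-proxy service should be returned.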
Provider configuration
Once you have created an Anthropic account and purchased a subscription, you can create the route and AI Proxy plugin configuration.
Set up route and plugin
You can set these up with the Kong Admin API or with declarative YAML configuration; both are shown below.

Using the Kong Admin API, create the route:
curl -X POST http://localhost:8001/services/ai-proxy/routes \
--data "name=anthropic-chat" \
--data "paths[]=~/anthropic-chat$"
Enable and configure the AI Proxy plugin for Anthropic, replacing <anthropic_key> with your own API key:
curl -X POST http://localhost:8001/routes/anthropic-chat/plugins \
--data "name=ai-proxy" \
--data "config.route_type=llm/v1/chat" \
--data "config.auth.header_name=apikey" \
--data "config.auth.header_value=<anthropic_key>" \
--data "config.model.provider=anthropic" \
--data "config.model.name=claude-2.1" \
--data "config.model.options.max_tokens=512" \
--data "config.model.options.temperature=1.0" \
--data "config.model.options.top_p=256" \
--data "config.model.options.top_k=0.5"
Alternatively, the same route and plugin can be expressed as declarative YAML configuration:

name: anthropic-chat
paths:
  - "~/anthropic-chat$"
methods:
  - POST
plugins:
  - name: ai-proxy
    config:
      route_type: "llm/v1/chat"
      auth:
        header_name: "apikey"
        header_value: "<anthropic_key>" # add your own Anthropic API key
      model:
        provider: "anthropic"
        name: "claude-2.1"
        options:
          max_tokens: 512
          temperature: 1.0
          top_p: 0.5
          top_k: 256
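Note that this snippet shows only the route entry. In a full declarative configuration file, it sits under the ai-proxy service together with a format version marker. A minimal sketch, assuming the Kong 3.x declarative format and a file named kong.yaml:

_format_version: "3.0"
services:
  - name: ai-proxy
    url: http://localhost:32000
    routes:
      - name: anthropic-chat
        paths:
          - "~/anthropic-chat$"
        methods:
          - POST
        plugins:
          - name: ai-proxy
            config:
              # same plugin config as shown above
              route_type: "llm/v1/chat"

You can then apply the file with a declarative tool such as decK (for example, deck gateway sync kong.yaml on recent decK versions), or load it directly in DB-less mode.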
Test the configuration
Make an llm/v1/chat type request to test your new endpoint:
curl -X POST http://localhost:8000/anthropic-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
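Because AI Proxy standardizes requests and responses on the OpenAI API format, a successful call should return an OpenAI-style chat completion. The following is an illustrative (not verbatim) response shape; exact fields vary by provider and plugin version:

{
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "1 + 1 equals 2."
      },
      "finish_reason": "stop"
    }
  ],
  "model": "claude-2.1",
  "object": "chat.completion"
}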