This guide walks you through setting up the AI Proxy plugin with Gemini.

The Kong AI Proxy plugin can be attached to a route or a service entity. This guide attaches it to the service that fronts the Gemini API.

Prerequisites

You need a running Kong Gateway instance (the examples assume the Admin API is reachable at http://localhost:8001 and the proxy at http://localhost:8000) and a Google Gemini API key.
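
The commands below use a literal <GEMINI_API_TOKEN> placeholder for the key. If you prefer, you can export the key as an environment variable first (a shell convention, not a Kong requirement) and substitute $GEMINI_API_TOKEN for the placeholder in the plugin configuration step:

  export GEMINI_API_TOKEN="<your Gemini API key>"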

Configure the AI Proxy plugin

  1. Create a service in Kong Gateway that will represent the Google Gemini API:

     curl -i -X POST http://localhost:8001/services \
       --data "name=gemini-service" \
       --data "url=https://generativelanguage.googleapis.com"
  2. Create a route that maps to the service you defined:

     curl -i -X POST http://localhost:8001/routes \
       --data "paths[]=/gemini" \
       --data "service.id=$(curl -s http://localhost:8001/services/gemini-service | jq -r '.id')"
  3. Use the Kong Admin API to configure the AI Proxy plugin to route requests to Google Gemini:

     curl -i -X POST http://localhost:8001/services/gemini-service/plugins \
       --data 'name=ai-proxy' \
       --data 'config.auth.param_name=key' \
       --data 'config.auth.param_value=<GEMINI_API_TOKEN>' \
       --data 'config.auth.param_location=query' \
       --data 'config.route_type=llm/v1/chat' \
       --data 'config.model.provider=gemini' \
       --data 'config.model.name=gemini-1.5-flash'

Be sure to replace <GEMINI_API_TOKEN> with your Gemini API key.
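
To confirm the setup, you can read each object back from the Admin API. These are standard Admin API reads; the jq filters are only for readability and can be dropped:

  # The service that fronts the Gemini API
  curl -s http://localhost:8001/services/gemini-service | jq '{name, host}'

  # The route that exposes it on /gemini
  curl -s http://localhost:8001/routes | jq '.data[] | select(.paths | index("/gemini")) | {id, paths}'

  # The AI Proxy plugin scoped to the service
  curl -s http://localhost:8001/services/gemini-service/plugins | jq '.data[] | {name, route_type: .config.route_type, model: .config.model.name}'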

Test the configuration

Make an llm/v1/chat type request to test your new endpoint:

  curl -X POST http://localhost:8000/gemini \
    -H 'Content-Type: application/json' \
    --data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
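
For the llm/v1/chat route type, the plugin returns the upstream reply in an OpenAI-compatible chat completion format, regardless of provider. A trimmed, illustrative response is shown below; the values are placeholders, not captured output:

  {
    "choices": [
      {
        "index": 0,
        "message": { "role": "assistant", "content": "1 + 1 = 2" },
        "finish_reason": "stop"
      }
    ],
    "model": "gemini-1.5-flash",
    "object": "chat.completion"
  }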
