LLM Providers

Agent Gateway supports multiple LLM providers, allowing you to route requests to different AI models and manage API keys centrally.

Quick start

To use an LLM provider with Agent Gateway, configure an ai backend:

binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: my-llm
          provider:
            openAI:
              model: gpt-4o-mini
      policies:
        backendAuth:
          key: "$OPENAI_API_KEY"

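Once the gateway is running with this configuration, clients can send OpenAI-format requests to it and let the gateway attach credentials and forward to the provider. The following is a minimal sketch using the OpenAI Python SDK; the localhost address, the /v1 base path, and the placeholder client-side key are assumptions for illustration, since the backendAuth policy injects the real $OPENAI_API_KEY at the gateway.

# pip install openai
from openai import OpenAI

# Point the OpenAI client at the gateway instead of api.openai.com.
# The base_url path and placeholder key are assumptions; the route's
# backendAuth policy attaches the real API key before forwarding upstream.
client = OpenAI(base_url="http://localhost:3000/v1", api_key="placeholder")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello through the gateway."}],
)
print(response.choices[0].message.content)

Because the gateway manages the provider key centrally, application code never needs to hold the real credential; swapping providers or models is a configuration change on the ai backend rather than a client-side code change.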
See LLM Consumption for complete documentation on working with LLM providers.
