Backends

Backends control where traffic is routed. A backend is a destination service that receives traffic from agentgateway; it can be a static host, an MCP server, an LLM provider, or another service. Agentgateway supports a variety of backends, such as simple hostnames and IP addresses, LLM providers (for example OpenAI, Anthropic, or Azure, with routing across multiple providers based on configuration), and MCP servers.

Static Hosts

The simplest form of backend is a static hostname or IP address. For example:

# yaml-language-server: $schema=https://agentgateway.dev/schema/config
binds:
- port: 3000
  listeners:
  - protocol: HTTP
    routes:
    - backends:
      - host: example.com:8080
        weight: 1
      - host: 127.0.0.1:80
        weight: 9
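The `weight` values split traffic proportionally: with weights 1 and 9, roughly 10% of requests go to example.com:8080 and 90% to 127.0.0.1:80. A minimal sketch of weighted random selection, illustrative only and not agentgateway's internal algorithm:

```python
import random

# (host, weight) pairs matching the config above
backends = [("example.com:8080", 1), ("127.0.0.1:80", 9)]

def pick_backend(backends):
    """Pick one host, with probability proportional to its weight."""
    hosts = [h for h, _ in backends]
    weights = [w for _, w in backends]
    return random.choices(hosts, weights=weights, k=1)[0]

# Over many requests the split approaches the 1:9 ratio.
counts = {h: 0 for h, _ in backends}
for _ in range(10_000):
    counts[pick_backend(backends)] += 1
```

Weights are relative, not percentages: weights of 10 and 90 would produce the same split.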

MCP Servers

The MCP backend allows you to connect to an MCP server. The example below exposes a local MCP server (over stdio) and a remote one. See the MCP connectivity guide for more information.

backends:
- mcp:
    targets:
    - name: stdio-server
      stdio:
        cmd: npx
        args: ["@modelcontextprotocol/server-everything"]
    - name: http-server
      mcp:
        host: https://example.com/mcp
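Clients reach either target through the gateway; an MCP session begins with a JSON-RPC `initialize` request. A sketch of that request body (the client name and protocol version shown are illustrative):

```python
import json

# JSON-RPC 2.0 "initialize" request, the first message in an MCP session
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}
body = json.dumps(initialize)
```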

Session routing

By default, MCP backends use stateful session routing, where the gateway tracks session IDs and routes subsequent requests to the same upstream. For OpenAPI and SSE upstreams that do not maintain server-side session state, you can set statefulMode: Stateless. In stateless mode, the gateway automatically wraps each request with an initialization sequence, so the upstream server processes every request independently.

backends:
- mcp:
    statefulMode: Stateless
    targets:
    - name: openapi-server
      openapi:
        schema:
          url: https://petstore3.swagger.io/api/v3/openapi.json

LLM Providers

Agentgateway natively supports connecting to LLM providers, such as OpenAI and Anthropic. The example below connects to OpenAI. See the LLM consumption guide for more information.

backends:
- ai:
    provider:
      openAI:
        model: gpt-3.5-turbo
policies:
  backendAuth:
    key: "$OPENAI_API_KEY"
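Clients then send standard OpenAI-style chat completion requests to the gateway, which attaches the API key from the `backendAuth` policy so clients never hold it. A sketch of such a request payload (the message content is illustrative, and whether the request's model or the configured model takes precedence depends on the gateway's provider settings):

```python
import json

# OpenAI-style chat completion request body. The gateway injects the
# Authorization header from the backendAuth policy, so the client does
# not need the API key itself.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
}
body = json.dumps(payload)
```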