
Getting Started with Gatekeeper

Deploy Gatekeeper with Docker Compose, add your provider API keys, and make your first request to 290+ models through a single endpoint — in under five minutes.

Prerequisites

  • Docker and Docker Compose installed
  • At least one AI provider API key (OpenAI, Anthropic, etc.)
  • Port 4000 available (or configure a different port)

Deploy with Docker

1. Run the Docker container

The fastest way to start Gatekeeper is with a single Docker run command. This starts the proxy on port 4000 with an in-memory config — good for testing.

bash
docker run -d \
  -p 4000:4000 \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  --name gatekeeper \
  ghcr.io/gatekeeper-dev/gatekeeper:latest

For production, use Docker Compose with a PostgreSQL database for persistent config and usage data. See the self-hosting guide.
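For reference, a minimal docker-compose.yml along those lines might look like the sketch below. The image name and API key variables come from the docker run command above; the `DATABASE_URL` variable name and the Postgres wiring are assumptions — check the self-hosting guide for the exact settings Gatekeeper expects.

```yaml
services:
  gatekeeper:
    image: ghcr.io/gatekeeper-dev/gatekeeper:latest
    ports:
      - "4000:4000"
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY}
      # Assumed variable name -- see the self-hosting guide
      DATABASE_URL: postgres://gatekeeper:gatekeeper@db:5432/gatekeeper
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: gatekeeper
      POSTGRES_PASSWORD: gatekeeper
      POSTGRES_DB: gatekeeper
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```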

2. Verify the proxy is running

Check that Gatekeeper is up and can reach your configured providers:

bash
curl http://localhost:4000/health

Response
{
  "status": "ok",
  "providers": {
    "openai": "reachable",
    "anthropic": "reachable"
  },
  "version": "1.4.2"
}
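In a deployment script you might gate startup on this payload rather than eyeballing it. A minimal sketch — the dict shape mirrors the response above, but the helper name and logic are ours:

```python
import json

def providers_reachable(health: dict, required: set) -> bool:
    """Return True when the proxy reports ok and every required
    provider in the /health payload is reachable."""
    if health.get("status") != "ok":
        return False
    providers = health.get("providers", {})
    return all(providers.get(p) == "reachable" for p in required)

# Example with the payload shown above:
health = json.loads('''{
  "status": "ok",
  "providers": {"openai": "reachable", "anthropic": "reachable"},
  "version": "1.4.2"
}''')
print(providers_reachable(health, {"openai", "anthropic"}))  # True
```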

3. Add a virtual API key

Create a virtual key to hand to your application. This key routes to your real provider keys and can have budget limits, model restrictions, and usage tracking attached.

Create a key via API
curl -X POST http://localhost:4000/v1/keys \
  -H "Authorization: Bearer sk-gk-master-key" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-app",
    "models": ["gpt-4o", "claude-3-5-sonnet-20241022"],
    "budget_limit": 50.00,
    "budget_period": "monthly"
  }'
Response
{
  "key": "sk-gk-myapp-xxxxxxxxxxxx",
  "name": "my-app",
  "budget_limit": 50.00,
  "budget_remaining": 50.00,
  "models": ["gpt-4o", "claude-3-5-sonnet-20241022"]
}

Never embed your master key in application code. Use virtual keys — they can be rotated or revoked without touching your provider credentials.
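Since the response includes `budget_remaining`, your application can check it before kicking off an expensive batch. A hedged sketch — the field names follow the response above; the reserve-threshold logic is illustrative, not a Gatekeeper feature:

```python
def budget_exhausted(key_info: dict, reserve: float = 0.0) -> bool:
    """True when the virtual key's remaining budget has dropped to
    (or below) the reserve you want to keep in hand."""
    return key_info.get("budget_remaining", 0.0) <= reserve

# Example with fields from the key-creation response above:
key_info = {"budget_limit": 50.00, "budget_remaining": 12.50}
print(budget_exhausted(key_info))              # False
print(budget_exhausted(key_info, reserve=20))  # True
```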

4. Make your first request

Replace your OpenAI or Anthropic base URL with Gatekeeper's. Your existing SDK code works without any other changes.

Python (OpenAI SDK)
from openai import OpenAI

client = OpenAI(
    api_key="sk-gk-myapp-xxxxxxxxxxxx",  # Your virtual key
    base_url="http://localhost:4000/v1"  # Point to Gatekeeper
)

response = client.chat.completions.create(
    model="gpt-4o",          # Or: "claude-3-5-sonnet-20241022"
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
Direct cURL
curl http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-gk-myapp-xxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
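Because both models in this guide sit behind the same endpoint and key, a client-side fallback across them is a small loop. A sketch of that idea — the retry policy is ours, not a Gatekeeper feature, and `fake_call` stands in for a real wrapper around `client.chat.completions.create(model=model, ...)`:

```python
def first_success(models, call):
    """Try each model in order through the same Gatekeeper endpoint,
    returning the first response that doesn't raise."""
    last_err = None
    for model in models:
        try:
            return call(model)
        except Exception as err:  # e.g. rate limit or model restriction
            last_err = err
    raise RuntimeError(f"all models failed: {last_err}")

# Stubbed usage -- replace fake_call with your SDK request:
def fake_call(model):
    if model == "gpt-4o":
        raise TimeoutError("simulated rate limit")
    return f"answered by {model}"

print(first_success(["gpt-4o", "claude-3-5-sonnet-20241022"], fake_call))
# answered by claude-3-5-sonnet-20241022
```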