Supported Chat Completion Providers

Our platform supports multiple AI providers through a unified proxy interface.
Set the base_url in your client (or the request URL, for curl) to route requests through your proxy to the correct provider.

Supported Providers

Provider    curl   OpenAI Python SDK   Anthropic Python SDK   Base URL Path
OpenAI      ✓      ✓                   —                      /openai/v1
Anthropic   ✓      ✓                   ✓                      /anthropic/v1, /anthropic/v1/messages
xAI         ✓      ✓                   —                      /xai/v1

General pattern:

{PROXY_URL}/{provider_name}/{provider_version}

Where:

  • PROXY_URL is your project's proxy endpoint (e.g., https://<your_project_id>.proxy.tokenduck.xyx)
  • provider_name is one of: openai, anthropic, xai
  • provider_version matches the provider's own API version path (e.g., v1)
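
The pattern above can be sketched as a small helper. The `endpoint` function is illustrative only, and the project id below is a placeholder:

```python
# Minimal sketch of the URL pattern above; `endpoint` is an illustrative
# helper, not part of the platform, and the project id is a placeholder.
PROXY_URL = "https://<your_project_id>.proxy.tokenduck.xyx"

def endpoint(provider_name: str, provider_version: str = "v1") -> str:
    """Build {PROXY_URL}/{provider_name}/{provider_version}."""
    return f"{PROXY_URL}/{provider_name}/{provider_version}"

print(endpoint("openai"))  # base URL for the OpenAI provider
```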

OpenAI

Using curl

curl https://<your_project_id>.proxy.tokenduck.xyx/openai/v1/chat/completions \
  -H "Authorization: Bearer OPENAI_API_KEY" \
  -H "X-Proxy-API-Key: PROXY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello from TokenDuck"}]
  }'

Using OpenAI Python SDK

from openai import OpenAI

client = OpenAI(
    api_key="OPENAI_API_KEY",
    base_url="https://<your_project_id>.proxy.tokenduck.xyx/openai/v1",
    default_headers={"X-Proxy-API-Key": "PROXY_API_KEY"},
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from TokenDuck"}],
)
print(resp)

Anthropic

Using curl

curl https://<your_project_id>.proxy.tokenduck.xyx/anthropic/v1/messages \
  -H "x-api-key: ANTHROPIC_API_KEY" \
  -H "X-Proxy-API-Key: PROXY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-sonnet-20240229",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello from TokenDuck"}]
  }'

Using Anthropic Python SDK

from anthropic import Anthropic

client = Anthropic(
    api_key="ANTHROPIC_API_KEY",
    base_url="https://<your_project_id>.proxy.tokenduck.xyx/anthropic",
    default_headers={"X-Proxy-API-Key": "PROXY_API_KEY"},
)

resp = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello from TokenDuck"}],
)
print(resp)

Using OpenAI Python SDK (Anthropic through OpenAI interface)

from openai import OpenAI

client = OpenAI(
    api_key="ANTHROPIC_API_KEY",
    base_url="https://<your_project_id>.proxy.tokenduck.xyx/anthropic/v1",
    default_headers={"X-Proxy-API-Key": "PROXY_API_KEY"},
)

resp = client.chat.completions.create(
    model="claude-3-sonnet-20240229",
    messages=[{"role": "user", "content": "Hello from TokenDuck"}],
)
print(resp)
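
Under the hood, the proxy has to map the OpenAI-style request onto Anthropic's Messages API. The sketch below illustrates the kind of translation involved; the field handling here is an assumption for illustration, not the proxy's actual implementation:

```python
# Rough sketch of the translation a proxy performs when exposing Anthropic
# through an OpenAI-style chat completions interface. Field choices are
# illustrative assumptions, not the proxy's actual code.
def to_anthropic(openai_req: dict) -> dict:
    """Map an OpenAI-style chat request onto Anthropic's Messages API shape."""
    # Anthropic takes system prompts as a top-level field, not a message role.
    system = [m["content"] for m in openai_req["messages"] if m["role"] == "system"]
    messages = [m for m in openai_req["messages"] if m["role"] != "system"]
    body = {
        "model": openai_req["model"],
        "messages": messages,
        # Anthropic's Messages API requires max_tokens; default it if absent.
        "max_tokens": openai_req.get("max_tokens", 1024),
    }
    if system:
        body["system"] = "\n".join(system)
    return body
```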

xAI

Using curl

curl https://<your_project_id>.proxy.tokenduck.xyx/xai/v1/chat/completions \
  -H "Authorization: Bearer XAI_API_KEY" \
  -H "X-Proxy-API-Key: PROXY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "grok-beta",
    "messages": [{"role": "user", "content": "Hello from TokenDuck"}]
  }'

Using OpenAI Python SDK

from openai import OpenAI

client = OpenAI(
    api_key="XAI_API_KEY",
    base_url="https://<your_project_id>.proxy.tokenduck.xyx/xai/v1",
    default_headers={"X-Proxy-API-Key": "PROXY_API_KEY"},
)

resp = client.chat.completions.create(
    model="grok-beta",
    messages=[{"role": "user", "content": "Hello from TokenDuck"}],
)
print(resp)
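
The examples above inline keys for brevity; in practice, read them from the environment. A minimal sketch (the variable names are conventions assumed here, and `load_keys` is an illustrative helper):

```python
# Sketch: read credentials from the environment instead of hardcoding them.
# The variable names (e.g., OPENAI_API_KEY, PROXY_API_KEY) are conventions
# assumed here; `load_keys` is an illustrative helper.
import os

def load_keys(provider_env: str = "OPENAI_API_KEY") -> dict:
    """Return the provider key and proxy key, failing fast if either is unset."""
    provider_key = os.environ.get(provider_env)
    proxy_key = os.environ.get("PROXY_API_KEY")
    missing = [name for name, val in [(provider_env, provider_key),
                                      ("PROXY_API_KEY", proxy_key)] if not val]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {"api_key": provider_key, "proxy_key": proxy_key}
```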