AI Providers

oAI-Web supports three AI providers: Anthropic, OpenRouter, and OpenAI. Each is optional, but at least one must be configured.


Architecture

All providers implement the AIProvider abstract base class in server/providers/base.py. The agent loop always calls chat_async() and receives a normalised ProviderResponse.

Tool schemas are in Anthropic-native format internally. Each provider translates them to its own wire format.

Message format

Internally, messages use a superset of the OpenAI conversation format. Providers translate to their wire format on every call.
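
The contract described above can be sketched roughly as follows. This is an illustrative outline, not the actual contents of server/providers/base.py; the field names on ProviderResponse are assumptions.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class ProviderResponse:
    # Normalised response shape returned by every provider (fields assumed).
    text: str
    tool_calls: list = field(default_factory=list)
    stop_reason: str = "end_turn"

class AIProvider(ABC):
    @abstractmethod
    async def chat_async(self, messages, model, tools=None) -> ProviderResponse:
        """Translate internal messages and Anthropic-native tool schemas to
        the provider's wire format, call its API, and normalise the reply."""
```

A concrete provider subclasses AIProvider and implements chat_async(); the agent loop never sees provider-specific response shapes.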


Anthropic

SDK: anthropic (native Python SDK)

Setup

  1. Get an API key from console.anthropic.com
  2. In oAI-Web: Settings → Credentials → add system:anthropic_api_key

Default model

claude-3-5-sonnet-20241022 (hardcoded fallback; override with DEFAULT_CHAT_MODEL in .env or model picker)

Available models

Hardcoded list in server/providers/models.py — update this file to add new Claude models as they're released.

Current models:

  • claude-opus-4-6
  • claude-sonnet-4-6
  • claude-haiku-4-5-20251001
  • claude-3-5-sonnet-20241022
  • claude-3-5-haiku-20241022
  • claude-3-opus-20240229
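
The hardcoded list and the default-model fallback described above might look like this. The names ANTHROPIC_MODELS and default_chat_model are assumptions for illustration, not the real contents of server/providers/models.py.

```python
import os

# Hardcoded Anthropic model list (updated by hand as new models ship).
ANTHROPIC_MODELS = [
    "claude-opus-4-6",
    "claude-sonnet-4-6",
    "claude-haiku-4-5-20251001",
    "claude-3-5-sonnet-20241022",
    "claude-3-5-haiku-20241022",
    "claude-3-opus-20240229",
]

def default_chat_model() -> str:
    # The .env override wins; otherwise fall back to the hardcoded default.
    return os.environ.get("DEFAULT_CHAT_MODEL", "claude-3-5-sonnet-20241022")
```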


OpenRouter

SDK: openai (OpenAI-compatible API at https://openrouter.ai/api/v1)

OpenRouter gives access to hundreds of models from various providers (OpenAI, Anthropic, Meta, Google, etc.) through a single API.

Setup

  1. Get an API key from openrouter.ai
  2. In oAI-Web: Settings → Credentials → add system:openrouter_api_key

Headers

oAI-Web sends these headers to OpenRouter for attribution:

X-Title: oAI-Web
HTTP-Referer: https://mac.oai.pm
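
A minimal sketch of those headers, with a comment showing how they would typically be attached via the openai SDK's default_headers option; the client wiring is an assumption, not the exact oAI-Web code.

```python
OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def openrouter_headers() -> dict:
    # Attribution headers sent with every OpenRouter request.
    return {
        "X-Title": "oAI-Web",
        "HTTP-Referer": "https://mac.oai.pm",
    }

# With the openai SDK this would typically be wired up as:
# client = OpenAI(base_url=OPENROUTER_BASE_URL, api_key=key,
#                 default_headers=openrouter_headers())
```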

Model discovery

OpenRouter models are fetched from https://openrouter.ai/api/v1/models and cached for 1 hour. The full model list is available in the Models page (/models).
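
The 1-hour cache can be sketched like this, assuming a fetch callable that performs the actual HTTP GET; the cache structure is illustrative, not the real implementation.

```python
import time

_CACHE_TTL = 3600.0  # 1 hour, matching the cache window described above
_cache = {"models": None, "fetched_at": 0.0}

def get_openrouter_models(fetch):
    """Return the cached model list, refetching once the TTL expires.

    `fetch` stands in for the HTTP GET to https://openrouter.ai/api/v1/models.
    """
    now = time.monotonic()
    if _cache["models"] is None or now - _cache["fetched_at"] > _CACHE_TTL:
        _cache["models"] = fetch()
        _cache["fetched_at"] = now
    return _cache["models"]
```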

Capability detection

  • Vision: checks architecture.input_modalities (list) or architecture.modality (string)
  • Tools: checks supported_generation_parameters or supported_parameters
  • Online: model ID contains "online", name contains "online", or ID starts with "perplexity/"
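
The three checks above can be sketched against the OpenRouter /models payload like this; the function names are assumptions, and the field names come from the bullets above.

```python
def supports_vision(model: dict) -> bool:
    # Prefer the list form; fall back to the legacy modality string.
    arch = model.get("architecture") or {}
    modalities = arch.get("input_modalities")
    if isinstance(modalities, list):
        return "image" in modalities
    return "image" in str(arch.get("modality", ""))

def supports_tools(model: dict) -> bool:
    params = (model.get("supported_generation_parameters")
              or model.get("supported_parameters") or [])
    return "tools" in params

def is_online(model: dict) -> bool:
    model_id = model.get("id", "")
    name = model.get("name", "")
    return ("online" in model_id
            or "online" in name.lower()
            or model_id.startswith("perplexity/"))
```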

OpenAI

SDK: openai (native Python SDK)

Setup

  1. Get an API key from platform.openai.com
  2. In oAI-Web: Settings → Credentials → add system:openai_api_key

Available models

Discovery not yet implemented for OpenAI — models are not listed in the model picker. Specify the model directly in agent configuration (e.g. openai:gpt-4o).


Provider selection

Default provider

Set via Settings → General → Default Provider dropdown (stored as system:default_provider).

Priority order:

  1. system:default_provider credential
  2. DEFAULT_PROVIDER in .env
  3. Hardcoded fallback: "anthropic"
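
That resolution chain reduces to a short fallback expression. A sketch, where get_credential is a stand-in for the credential-store lookup (returning None when unset):

```python
import os

def resolve_default_provider(get_credential) -> str:
    # Credential wins, then the .env variable, then the hardcoded fallback.
    return (get_credential("system:default_provider")
            or os.environ.get("DEFAULT_PROVIDER")
            or "anthropic")
```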

Per-turn model override

In the chat UI, the model picker lets you select any model from any provider. The selection is sent with each message as "provider:model" (e.g. "anthropic:claude-sonnet-4-6" or "openrouter:openai/gpt-4o").

Per-user API keys

Each user can store their own API keys in Settings → API Keys. These take priority over the system-wide keys stored in the admin credentials store.

Priority order for API key lookup:

  1. Per-user {provider}_api_key in user_settings
  2. Global system:{provider}_api_key in credentials
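
As a sketch, with plain dicts standing in for the user_settings and credentials stores:

```python
def resolve_api_key(provider: str, user_settings: dict, credentials: dict):
    # Per-user key wins over the global system-wide key; None if neither is set.
    return (user_settings.get(f"{provider}_api_key")
            or credentials.get(f"system:{provider}_api_key"))
```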

Model string format

The "provider:model" format is used throughout:

anthropic:claude-sonnet-4-6
openrouter:openai/gpt-4o
openai:gpt-4o

If no provider prefix is given, the default provider is used.
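
Parsing this format only needs a split on the first colon, since OpenRouter model IDs contain "/" but never ":". A sketch (the function name is illustrative):

```python
def parse_model_string(spec: str, default_provider: str = "anthropic"):
    # Split "provider:model" on the first colon only, so slashes in
    # OpenRouter model IDs pass through untouched.
    if ":" in spec:
        provider, model = spec.split(":", 1)
        return provider, model
    return default_provider, spec
```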


Adding a provider

  1. Create server/providers/my_provider.py implementing AIProvider
  2. Add the provider name to _known in registry.py::get_provider_for_model()
  3. Add a get_provider_for_name() branch in registry.py
  4. Add a branch in get_available_providers() and _resolve_key()
  5. Add model discovery logic in models.py if the provider has a model list API
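
Step 1 might look roughly like the skeleton below. The ProviderResponse stand-in and the chat_async signature are assumptions about base.py; the real file would subclass the actual AIProvider.

```python
# server/providers/my_provider.py (illustrative skeleton for step 1)
from dataclasses import dataclass

@dataclass
class ProviderResponse:  # stand-in for the real class in base.py
    text: str

class MyProvider:  # would subclass AIProvider in the real file
    name = "my_provider"

    async def chat_async(self, messages, model, tools=None) -> ProviderResponse:
        # Translate internal messages and Anthropic-native tool schemas to
        # the provider's wire format, call its API, normalise the reply.
        raise NotImplementedError
```

Steps 2 to 4 then register that name in registry.py so the "my_provider:some-model" prefix routes to this class.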