
Configuration

Clawforce supports two levels of provider configuration:

  • Global providers are created in Settings and available to every instance. This is the most common setup.
  • Instance providers are created on an individual instance’s detail page and apply only to that instance. Use these when an instance needs its own API key or a provider that other instances should not access.

Both types work identically with the LLM gateway. Instance providers appear alongside global providers when you enable models on that instance.

To add a global provider, go to Settings and find the Model API Keys section. Click Add Provider. A modal opens with two paths:

From the catalog — Select a known provider (Anthropic, OpenAI, Moonshot AI, and others) from the dropdown. Clawforce pre-fills the name and fetches the model list automatically from the Clawforce hosted catalog. You only need to supply your API key for that provider.

Custom — Select Custom from the dropdown to configure any provider manually, including self-hosted models (Ollama, vLLM, LM Studio) or any service with an OpenAI-compatible API. Fill in:

  • Name — Display name shown in the dashboard
  • Base URL — Provider API base URL; see API types for the correct format per provider
  • API type — Request format the provider uses; controls auth headers and URL handling
  • API key — Provider credential. Stored encrypted and shown masked (**** + last 4 characters) after saving
  • Models — For custom providers: define each model with its ID, name, and optional metadata (context window, max tokens, cost rates)
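Put together, a custom provider definition covers these fields. Below is a minimal sketch as a Python dict for a self-hosted Ollama server — the field names mirror the descriptions above, but the exact payload shape Clawforce expects is not specified here, so treat this as illustrative:

```python
# Illustrative custom-provider definition for a self-hosted Ollama server.
# Field names mirror the configuration fields described in the docs; the
# exact shape Clawforce stores may differ.
custom_provider = {
    "name": "My Ollama",                   # display name shown in the dashboard
    "base_url": "http://localhost:11434",  # root URL, no /v1 (see API types)
    "api_type": "ollama",
    "api_key": "",                         # Ollama accepts an empty key
    "models": [
        {
            "id": "llama3.1:8b",
            "name": "Llama 3.1 8B",
            "context_window": 131072,      # optional metadata
            "max_tokens": 8192,
        }
    ],
}
```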

The provider key — a URL-safe slug like anthropic or my-ollama — is derived automatically from the name when you create the provider. It cannot be changed after creation.
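The derivation can be pictured with a small helper — a hedged sketch assuming a typical slugify (lowercase, runs of non-alphanumeric characters collapsed to hyphens); Clawforce's actual rules may differ:

```python
import re

def derive_provider_key(name: str) -> str:
    """Approximate the name -> provider-key derivation: lowercase the
    display name and collapse runs of non-alphanumeric characters into
    single hyphens. Matches the examples in the docs; the real rules
    may differ."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

print(derive_provider_key("Anthropic"))   # anthropic
print(derive_provider_key("My Ollama"))   # my-ollama
```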

The api_type field controls which authentication header the gateway forwards and how it handles the base URL. Choose the type that matches your provider’s API format.

  • openai-completions — OpenAI, Together AI, OpenRouter, and any OpenAI-compatible API. Auth sent upstream: Authorization: Bearer. Base URL must end with /v1, e.g. https://api.openai.com/v1
  • openai-responses — OpenAI Responses API. Auth sent upstream: Authorization: Bearer. Base URL must end with /v1
  • anthropic-messages — Anthropic. Auth sent upstream: x-api-key. Base URL is the root URL only, e.g. https://api.anthropic.com (no /v1)
  • google-generative-ai — Google Gemini. Auth sent upstream: x-goog-api-key. Base URL must include /v1beta, e.g. https://generativelanguage.googleapis.com/v1beta
  • ollama — Ollama (self-hosted). Auth sent upstream: Authorization: Bearer (empty key accepted). Base URL is the root URL, e.g. http://localhost:11434
  • anthropic-messages: set the root URL without /v1 (e.g. https://api.anthropic.com). The Anthropic SDK appends /v1/messages itself.
  • google-generative-ai: include /v1beta in the URL. The Google SDK omits the version segment when a custom base URL is provided, so you must include it explicitly.
  • openai-completions / openai-responses: the URL must end with /v1. The gateway de-duplicates the segment if it is already present.
  • ollama: an empty or missing API key is accepted — Ollama does not require authentication by default.
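The per-type rules above can be sketched as a small normalization helper. This is an illustration of the documented behavior, not the gateway's actual code:

```python
def prepare_upstream(api_type: str, base_url: str, api_key: str = ""):
    """Illustrate the per-api_type URL and auth handling described above.
    A sketch of the documented rules, not Clawforce's gateway code."""
    url = base_url.rstrip("/")
    if api_type in ("openai-completions", "openai-responses"):
        # Must end with /v1; de-duplicate if the segment is already present.
        if not url.endswith("/v1"):
            url += "/v1"
        headers = {"Authorization": f"Bearer {api_key}"}
    elif api_type == "anthropic-messages":
        headers = {"x-api-key": api_key}       # SDK appends /v1/messages itself
    elif api_type == "google-generative-ai":
        headers = {"x-goog-api-key": api_key}  # URL must already include /v1beta
    elif api_type == "ollama":
        headers = {"Authorization": f"Bearer {api_key}"}  # empty key accepted
    else:
        raise ValueError(f"unknown api_type: {api_type}")
    return url, headers

print(prepare_upstream("openai-completions", "https://api.openai.com"))
# ('https://api.openai.com/v1', {'Authorization': 'Bearer '})
```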

Any provider with an OpenAI-compatible /chat/completions endpoint can be added as a custom provider with api_type: openai-completions, regardless of whether it appears in the catalog.
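One way to check compatibility before adding a provider is to build the standard /chat/completions request yourself. A minimal sketch using Python's standard library — the base URL and model name are examples (a local Ollama server in OpenAI-compatible mode), and the final send is left commented out:

```python
import json
import urllib.request

def chat_completions_request(base_url, model, messages, api_key=""):
    """Build a standard OpenAI-compatible /chat/completions POST request.
    Any provider that accepts this shape can be added as a custom
    provider with api_type: openai-completions."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = chat_completions_request(
    "http://localhost:11434/v1",   # example: local Ollama, OpenAI-compatible endpoint
    "llama3.1:8b",
    [{"role": "user", "content": "ping"}],
)
print(req.full_url)  # http://localhost:11434/v1/chat/completions
# urllib.request.urlopen(req) would actually send it -- omitted here.
```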

To add an instance provider, open the instance detail page and go to the Enabled Models section. Click Add provider to open the same provider modal, but scoped to that instance. The provider and its API key are stored separately from global providers and are only available to that specific instance.

Instance providers are labeled with an Instance badge in the provider list to distinguish them from global ones. You can edit or delete an instance provider from the same location.

For well-known providers, the model list is fetched from the Clawforce hosted catalog and cached locally. Click Sync Models to force a refresh of providers, models, and prices.