Use OminiGate in OpenClaw
OpenClaw is a self-hosted AI assistant gateway. Register OminiGate as a provider in one JSON file and unlock every model in the catalog.
Overview
OpenClaw is an open-source, self-hosted AI assistant that can route to any OpenAI-compatible or Anthropic-compatible provider. Its config lives at ~/.openclaw/openclaw.json and supports mixing multiple providers in a single installation.
Because OminiGate exposes both /v1/chat/completions (OpenAI) and /v1/messages (Anthropic), you can register it under either protocol — or both, for maximum model coverage.
- One JSON file, no binary rebuilds.
- OminiGate sits alongside any providers you already have — OpenClaw merges them into a single model picker.
- Define per-model metadata once — name, modality, context window — and OpenClaw's UI picks it up.
Prerequisites
Install OpenClaw and grab an OminiGate API key.
1. Install OpenClaw
OpenClaw ships as a global npm package. Node 18+ is required. The first run creates the ~/.openclaw directory where your config will live.
```shell
npm install -g openclaw@latest
```

2. Get an OminiGate API key
Create one in the dashboard. The key lives directly in openclaw.json — keep the file out of any public repo.
Configure the provider
Register OminiGate under models.providers. Pick the protocol that matches the models you want to call — OpenAI for broad coverage, Anthropic for Claude-specific features like prompt caching. mode: "merge" keeps OpenClaw's built-in providers alongside OminiGate.
Pick the tab that matches the protocol of the models you want. openai-completions uses OminiGate's /v1/chat/completions and covers every vendor — OpenAI, Anthropic, Google, and more. anthropic-messages uses OminiGate's /v1/messages and preserves Anthropic-only features like prompt caching. Note that the apiKey field references the OMINIGATE_API_KEY environment variable — OpenClaw resolves placeholders from your shell at startup, so secrets never live literally inside JSON.
```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "ominigate/openai/gpt-5.4-pro" }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "ominigate": {
        "baseUrl": "https://api.ominigate.ai/v1",
        "api": "openai-completions",
        "apiKey": "${OMINIGATE_API_KEY}",
        "models": [
          { "id": "openai/gpt-5.4-pro", "name": "GPT-5.4 Pro" },
          { "id": "anthropic/claude-opus-4.6", "name": "Claude Opus 4.6" },
          { "id": "google/gemini-3.1-pro-preview", "name": "Gemini 3.1 Pro (preview)" }
        ]
      }
    }
  }
}
```

Config path: ~/.openclaw/openclaw.json
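If you would rather use the Anthropic protocol (for example, to keep prompt caching), the same gateway can be registered under api: "anthropic-messages". A sketch, following the baseUrl rule from the troubleshooting section below (no /v1 suffix, since OpenClaw appends /v1/messages itself); the provider key name ominigate-anthropic is an arbitrary choice:

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "ominigate-anthropic": {
        "baseUrl": "https://api.ominigate.ai",
        "api": "anthropic-messages",
        "apiKey": "${OMINIGATE_API_KEY}",
        "models": [
          { "id": "anthropic/claude-opus-4.6", "name": "Claude Opus 4.6" },
          { "id": "anthropic/claude-sonnet-4.6", "name": "Claude Sonnet 4.6" }
        ]
      }
    }
  }
}
```

Both providers can coexist in one config — the slugs simply gain different prefixes (ominigate/… vs. ominigate-anthropic/…).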
Export the API key
OpenClaw reads OMINIGATE_API_KEY from your environment at startup. Put the export in ~/.zshrc or ~/.bashrc so it survives new shells.
```shell
export OMINIGATE_API_KEY="sk-omg-your-api-key"
```

Validate, restart, and run
OpenClaw only reloads config on gateway restart, so always validate → restart → run after editing openclaw.json.
1. Validate the config
openclaw doctor checks JSON syntax, env-var resolution, and provider reachability. openclaw models list confirms your new OminiGate slugs are visible.
```shell
openclaw doctor
openclaw models list   # expect to see ominigate/openai/gpt-5.4-pro etc.
```

2. Restart the gateway
Config edits only take effect after a gateway restart. Skip this step and new providers or models won't appear in sessions.
```shell
openclaw gateway restart
```

3. Run or switch primary model
One-shot runs take --model; models set changes the default persistently; /model swaps mid-session.
```shell
# One-shot with an explicit model
openclaw run --model ominigate/openai/gpt-5.4-pro "Review the changes in this branch"

# Or change the default primary model persistently
openclaw models set ominigate/anthropic/claude-opus-4.6

# Or swap mid-conversation
/model ominigate/google/gemini-3.1-pro-preview
```

Recommended models
Any model in the OminiGate catalog can be added to the models array. These four work well as starter entries.
- openai/gpt-5.4-pro: Top-tier OpenAI model. Great default for tool-use agents.
- anthropic/claude-opus-4.6: Anthropic's best. Register under the anthropic-messages provider for prompt caching support.
- anthropic/claude-sonnet-4.6: Faster Anthropic tier with strong code quality at lower latency.
- google/gemini-3.1-pro-preview: Google flagship, multimodal, long context. Routes through OminiGate's OpenAI endpoint.
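Taken together, the four starter entries map onto a models array like this (the display names are illustrative — OpenClaw shows whatever you put in name):

```json
"models": [
  { "id": "openai/gpt-5.4-pro", "name": "GPT-5.4 Pro" },
  { "id": "anthropic/claude-opus-4.6", "name": "Claude Opus 4.6" },
  { "id": "anthropic/claude-sonnet-4.6", "name": "Claude Sonnet 4.6" },
  { "id": "google/gemini-3.1-pro-preview", "name": "Gemini 3.1 Pro (preview)" }
]
```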
Browse the full catalog at /models.
Troubleshooting
My new models don't show up in openclaw models list.
Most often the gateway hasn't been restarted yet — run openclaw gateway restart. If they're still missing, openclaw doctor will flag JSON parse errors or unresolved env vars. You can also sanity-check with jq . < ~/.openclaw/openclaw.json.
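As a self-contained sketch of that jq sanity check (a temp file stands in for the real config so the snippet runs anywhere — in practice, point jq at ~/.openclaw/openclaw.json):

```shell
# Write a minimal config to a temp file, then validate it the same way a
# parse error would surface: jq exits non-zero on malformed JSON.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
{ "models": { "providers": { "ominigate": { "api": "openai-completions" } } } }
EOF
api=$(jq -r '.models.providers.ominigate.api' "$cfg") && echo "config OK: $api"
rm -f "$cfg"
```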
I get 404 Not Found from OminiGate.
Double-check baseUrl. For openai-completions use https://api.ominigate.ai/v1 (with the /v1 suffix). For anthropic-messages use https://api.ominigate.ai (no suffix — OpenClaw appends /v1/messages itself).
A model I registered is getting rejected at call time.
If you set agents.defaults.model.models, it becomes an allowlist: only models listed there can be called; every other registered model is blocked. Either add the slug to that map, or remove the entire models key under defaults to drop the restriction.
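A sketch of such an allowlist, permitting exactly two OminiGate slugs (the empty-object values are an assumption — mirror whatever shape your existing entries use):

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "ominigate/openai/gpt-5.4-pro",
        "models": {
          "ominigate/openai/gpt-5.4-pro": {},
          "ominigate/anthropic/claude-opus-4.6": {}
        }
      }
    }
  }
}
```

Deleting the whole "models" map here restores access to every registered model.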
openclaw doctor reports my API key is unresolved.
OpenClaw resolves the OMINIGATE_API_KEY placeholder from the gateway process's environment. If you're running as a systemd service, add OMINIGATE_API_KEY=... to the Environment= directive or the referenced env file. For desktop use, make sure the export lives in the shell profile that launched the gateway.
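For a systemd-managed gateway, either directive works; the unit name and file paths below are illustrative, not something OpenClaw ships:

```ini
# /etc/systemd/system/openclaw-gateway.service (hypothetical unit name)
[Service]
# Inline value (readable by anyone who can read the unit file):
Environment=OMINIGATE_API_KEY=sk-omg-your-api-key
# Or reference an env file, which can carry tighter permissions:
EnvironmentFile=/etc/openclaw/ominigate.env
```

After editing the unit, run systemctl daemon-reload and restart the service so the new environment reaches the gateway process.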
Prompt caching doesn't seem to activate.
Prompt caching only runs on the Anthropic protocol. Register Anthropic models under api: "anthropic-messages" — the OpenAI-compatible endpoint does not forward the cache_control blocks.
Next steps
Explore the full API reference or browse more models you can register in OpenClaw.