OpenClaw
OpenClaw is a popular open-source AI agent framework used by developers for autonomous workflows, code generation, and task automation. Its derivatives — including NemoClaw, ClawdBot, and others — all support the OpenAI-compatible API format.
Point OpenClaw at your mycellm node:
```json
{
  "providers": {
    "mycellm": {
      "baseUrl": "http://localhost:8420/v1",
      "api": "openai-completions",
      "models": [
        {
          "id": "auto",
          "name": "mycellm Auto",
          "contextWindow": 32768,
          "maxTokens": 4096
        }
      ]
    }
  }
}
```

Place this in your agent's `models.json` configuration.
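If you generate `models.json` from a script rather than by hand, a round-trip through a JSON parser catches typos before OpenClaw ever loads the file. A minimal Python sketch building the same provider entry as above (the validation approach is illustrative, not part of OpenClaw):

```python
import json

# Build the same provider entry as the models.json example above.
config = {
    "providers": {
        "mycellm": {
            "baseUrl": "http://localhost:8420/v1",
            "api": "openai-completions",
            "models": [
                {
                    "id": "auto",
                    "name": "mycellm Auto",
                    "contextWindow": 32768,
                    "maxTokens": 4096,
                }
            ],
        }
    }
}

# Serialize and re-parse to confirm the written file will be valid JSON.
text = json.dumps(config, indent=2)
parsed = json.loads(text)
assert parsed["providers"]["mycellm"]["baseUrl"].endswith("/v1")
```

Writing `text` to `models.json` then gives OpenClaw a file that is guaranteed to parse.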
Environment variables
Alternatively, set the standard OpenAI env vars:
```bash
export OPENAI_BASE_URL=http://localhost:8420/v1
export OPENAI_API_KEY=your-mycellm-key  # optional if no auth
```

Using the public network

No node needed — use the public gateway directly:
```json
{
  "providers": {
    "mycellm-public": {
      "baseUrl": "https://api.mycellm.dev/v1/public",
      "api": "openai-completions",
      "models": [
        {
          "id": "auto",
          "name": "mycellm Public",
          "contextWindow": 32768,
          "maxTokens": 1024
        }
      ]
    }
  }
}
```

Multiple providers with fallback

Keep mycellm as primary and a paid provider as fallback:
```json
{
  "providers": {
    "mycellm": {
      "baseUrl": "http://localhost:8420/v1",
      "api": "openai-completions",
      "models": [
        {"id": "Qwen2.5-3B-Instruct-Q8_0", "name": "Qwen 3B (local)"},
        {"id": "Mistral-Small-24B-Q4_K_M", "name": "Mistral 24B (fleet)"}
      ]
    },
    "openrouter": {
      "baseUrl": "https://openrouter.ai/api/v1",
      "api": "openai-completions",
      "apiKey": "sk-or-...",
      "models": [
        {"id": "auto", "name": "OpenRouter Fallback"}
      ]
    }
  }
}
```

mycellm is listed first, so OpenClaw uses it by default. If mycellm is unavailable, it falls back to OpenRouter.
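The fallback behavior described above amounts to ordinary try-the-next-provider logic. A Python sketch of the idea — the function names and simulated providers here are illustrative, not OpenClaw's actual internals:

```python
# Illustrative provider-fallback loop: try each configured provider in
# order and return the first successful response.
def complete_with_fallback(providers, prompt):
    errors = []
    for name, send in providers:
        try:
            return name, send(prompt)
        except ConnectionError as exc:
            errors.append((name, exc))
    raise RuntimeError(f"all providers failed: {errors}")

# Simulated providers: the local node is down, the fallback answers.
def local_node(prompt):
    raise ConnectionError("mycellm node unreachable")

def openrouter(prompt):
    return f"echo: {prompt}"

used, reply = complete_with_fallback(
    [("mycellm", local_node), ("openrouter", openrouter)], "hi"
)
assert used == "openrouter"
```

Because providers are tried in configuration order, putting mycellm first means the paid provider is only billed when the local node is actually unreachable.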
Derivatives
The same configuration works with OpenClaw derivatives:
- NemoClaw — same `models.json` format
- ClawdBot — same `models.json` format
- Any agent built on the OpenClaw framework
Private data
For sensitive workflows, use the `--private` trust flag or run a private mycellm network:
```bash
# CLI
mycellm chat --private
```

Or in the API config:

```json
{"mycellm": {"trust": "local"}}
```

This ensures prompts never leave your machine.
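One way to picture what `"trust": "local"` implies is a guard that rejects any endpoint not on the local machine. This is a hypothetical sketch of such a check — mycellm's real enforcement may work differently:

```python
from urllib.parse import urlparse

# Hosts considered "this machine" for the purposes of the sketch.
LOCAL_HOSTS = {"localhost", "127.0.0.1", "::1"}

def allowed(base_url: str, trust: str) -> bool:
    """Hypothetical guard: with trust="local", only local endpoints pass."""
    if trust != "local":
        return True  # other trust levels are out of scope for this sketch
    return urlparse(base_url).hostname in LOCAL_HOSTS

assert allowed("http://localhost:8420/v1", "local")
assert not allowed("https://api.mycellm.dev/v1/public", "local")
```

Under this model, a misconfigured public base URL fails closed instead of silently sending prompts off-machine.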