OpenClaw

OpenClaw is a popular open-source AI agent framework used by developers for autonomous workflows, code generation, and task automation. Its derivatives — including NemoClaw, ClawdBot, and others — all support the OpenAI-compatible API format.

Point OpenClaw at your mycellm node:

{
  "providers": {
    "mycellm": {
      "baseUrl": "http://localhost:8420/v1",
      "api": "openai-completions",
      "models": [
        {
          "id": "auto",
          "name": "mycellm Auto",
          "contextWindow": 32768,
          "maxTokens": 4096
        }
      ]
    }
  }
}

Place this in your agent’s models.json configuration.
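Before restarting the agent, it can help to check that the entry parses and carries the fields OpenClaw expects. A minimal sketch in Python (the field checks are assumptions based on the config shown above, not an official schema):

```python
import json

# The provider entry from models.json, inlined for the check.
config_text = """
{
  "providers": {
    "mycellm": {
      "baseUrl": "http://localhost:8420/v1",
      "api": "openai-completions",
      "models": [
        {"id": "auto", "name": "mycellm Auto", "contextWindow": 32768, "maxTokens": 4096}
      ]
    }
  }
}
"""

config = json.loads(config_text)  # raises ValueError on malformed JSON
provider = config["providers"]["mycellm"]
assert provider["baseUrl"].endswith("/v1"), "base URL should end in /v1"
assert provider["api"] == "openai-completions"
print("config OK, models:", [m["id"] for m in provider["models"]])
```

A trailing comma or a missing brace in models.json tends to fail silently in some agents, so a `json.loads` round-trip is a cheap guard.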

Alternatively, set the standard OpenAI env vars:

export OPENAI_BASE_URL=http://localhost:8420/v1
export OPENAI_API_KEY=your-mycellm-key # optional if no auth
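Most OpenAI-compatible SDKs and agents resolve their endpoint the same way: the environment variable wins, otherwise the hosted default applies. A sketch of that lookup (`resolve_base_url` is an illustrative helper, not part of any SDK):

```python
import os

def resolve_base_url(default="https://api.openai.com/v1"):
    # Env var takes precedence over the built-in default,
    # which is how OPENAI_BASE_URL redirects traffic to a mycellm node.
    return os.environ.get("OPENAI_BASE_URL", default)

os.environ["OPENAI_BASE_URL"] = "http://localhost:8420/v1"
assert resolve_base_url() == "http://localhost:8420/v1"
print("requests will go to", resolve_base_url())
```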

If you aren't running a local node, use the public gateway directly:

{
  "providers": {
    "mycellm-public": {
      "baseUrl": "https://api.mycellm.dev/v1/public",
      "api": "openai-completions",
      "models": [
        {
          "id": "auto",
          "name": "mycellm Public",
          "contextWindow": 32768,
          "maxTokens": 1024
        }
      ]
    }
  }
}
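Under the hood, every request the agent sends follows the standard OpenAI chat-completions shape, posted to `{baseUrl}/chat/completions`. A sketch of the body (the message content is illustrative):

```python
import json

# Request body for POST {baseUrl}/chat/completions;
# "auto" matches the model id configured above.
payload = {
    "model": "auto",
    "messages": [{"role": "user", "content": "Hello from OpenClaw"}],
    "max_tokens": 1024,  # the public gateway entry caps maxTokens at 1024
}
body = json.dumps(payload)
assert json.loads(body)["model"] == "auto"
```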

Keep mycellm as primary and a paid provider as fallback:

{
  "providers": {
    "mycellm": {
      "baseUrl": "http://localhost:8420/v1",
      "api": "openai-completions",
      "models": [
        {"id": "Qwen2.5-3B-Instruct-Q8_0", "name": "Qwen 3B (local)"},
        {"id": "Mistral-Small-24B-Q4_K_M", "name": "Mistral 24B (fleet)"}
      ]
    },
    "openrouter": {
      "baseUrl": "https://openrouter.ai/api/v1",
      "api": "openai-completions",
      "apiKey": "sk-or-...",
      "models": [
        {"id": "auto", "name": "OpenRouter Fallback"}
      ]
    }
  }
}

mycellm is listed first, so OpenClaw uses it by default. If mycellm is unavailable, it falls back to OpenRouter.
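The provider-order fallback described above amounts to trying each provider in turn until one succeeds. A sketch under stated assumptions: `call_provider` stands in for a real HTTP request and is hypothetical, as are the two stub providers.

```python
def complete(prompt, providers):
    """Try providers in configured order; return the first success."""
    errors = {}
    for name, call_provider in providers:
        try:
            return name, call_provider(prompt)
        except ConnectionError as exc:
            errors[name] = exc  # provider unreachable; try the next one
    raise RuntimeError(f"all providers failed: {errors}")

def mycellm_down(prompt):
    # Simulates the local node being offline.
    raise ConnectionError("node at localhost:8420 unreachable")

def openrouter_ok(prompt):
    return f"echo: {prompt}"

used, reply = complete("hi", [("mycellm", mycellm_down), ("openrouter", openrouter_ok)])
assert used == "openrouter" and reply == "echo: hi"
```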

The same configuration works with OpenClaw derivatives:

  • NemoClaw — same models.json format
  • ClawdBot — same models.json format
  • Any agent built on the OpenClaw framework

For sensitive workflows, use the --private trust flag or run a private mycellm network:

# CLI
mycellm chat --private
# API
{"mycellm": {"trust": "local"}}

This ensures prompts never leave your machine.