# First Chat
## Zero-config chat

After installing, just run:

```sh
mycellm chat
```

This automatically discovers available models:

- Local node — checks `localhost:8420` for loaded models
- LAN bootstrap — reads `MYCELLM_BOOTSTRAP_PEERS` from config
- Public network — falls back to `api.mycellm.dev`
No configuration needed for first use.
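The three-step discovery order above can be sketched in a few lines of Python. This is an illustration of the documented fallback order, not mycellm's actual implementation; in particular, reading `MYCELLM_BOOTSTRAP_PEERS` from the environment (rather than the config file) is a simplification for the example.

```python
import os
import socket


def discover_endpoint(port: int = 8420, timeout: float = 0.25) -> str:
    """Sketch of the discovery order: local node, then LAN peers,
    then the public network. Not mycellm's actual code."""
    # 1. Local node: is anything listening on localhost:8420?
    try:
        with socket.create_connection(("localhost", port), timeout=timeout):
            return f"http://localhost:{port}"
    except OSError:
        pass
    # 2. LAN bootstrap: first entry of MYCELLM_BOOTSTRAP_PEERS, if set
    #    (simplified here to an env var; mycellm reads it from config)
    peers = os.environ.get("MYCELLM_BOOTSTRAP_PEERS", "")
    if peers:
        return peers.split(",")[0].strip()
    # 3. Public network fallback
    return "https://api.mycellm.dev"
```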
## Chat REPL

```
mycellm_ chat ────────────────────────────────────────
Model: Qwen2.5-3B-Instruct-Q8_0
Node:  http://10.1.1.210:8420
Type /help for commands, /q to exit

╭──
│ What is distributed computing?
╰──

Distributed computing is a model where multiple computers work
together to solve a problem...

Qwen2.5-3B-Instruct-Q8_0 · via node 99e58f4c · 485ms
```

Features:
- Streaming — tokens appear as they’re generated
- Markdown rendering — code blocks with syntax highlighting
- Per-message attribution — model name, node hash, latency
- Multi-turn — conversation context maintained
- Slash commands — manage your node inline
## Slash Commands

| Command | Description |
|---|---|
| `/help` | Show all commands |
| `/status` | Node status (name, peers, models, uptime) |
| `/models` | List available models |
| `/credits` | Credit balance (earned/spent) |
| `/fleet` | Fleet nodes with online status |
| `/config` | Runtime configuration |
| `/use <model>` | Switch to a specific model |
| `/clear` | Clear conversation history |
| `/q` | Exit |
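The pattern behind the table is simple: a line starting with `/` is routed to a command handler instead of the model. A toy dispatcher showing that split (hypothetical names, not mycellm's source):

```python
def handle(line: str, handlers: dict, chat) -> str:
    """Route slash-prefixed input to a command handler;
    send everything else to the model via `chat`."""
    if line.startswith("/"):
        name, _, arg = line[1:].partition(" ")
        fn = handlers.get(name)
        return fn(arg) if fn else f"unknown command: /{name}"
    return chat(line)
```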
## Use as an API

Any tool that speaks the OpenAI protocol works:
### Python

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8420/v1",
    api_key="your-key",  # optional
)

response = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Hello"}],
)

print(response.choices[0].message.content)
```

### cURL

```sh
curl http://localhost:8420/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "auto",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

### Environment variables

```sh
export OPENAI_BASE_URL=http://localhost:8420/v1
export OPENAI_API_KEY=your-key
export OPENAI_MODEL=auto
```
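With those variables exported, even a stdlib-only script can build the same request without installing the `openai` package. A minimal sketch (the `chat_request` helper name is hypothetical):

```python
import json
import os
import urllib.request


def chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request from the
    environment variables above, falling back to local defaults."""
    base = os.environ.get("OPENAI_BASE_URL", "http://localhost:8420/v1")
    body = {
        "model": os.environ.get("OPENAI_MODEL", "auto"),
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {"Content-Type": "application/json"}
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        headers["Authorization"] = f"Bearer {key}"
    return urllib.request.Request(
        f"{base}/chat/completions",
        data=json.dumps(body).encode(),
        headers=headers,
        method="POST",
    )
```

Send it with `urllib.request.urlopen(chat_request("Hello"))` once a node is reachable.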