Introduction
OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. Designed for developers and power users, it gives you your own autonomous AI assistant without handing control of your data to a third party.
- GitHub: https://github.com/openclaw/openclaw
- Documentation: https://docs.openclaw.ai
- Official Site: https://openclaw.ai
🌟 Core Features
Multi-Channel Integration
- Canvas UI: Renders interactive Canvas interfaces
- Voice Support: Supports voice interaction on macOS/iOS/Android
- Single Gateway: Manages all channels through one Gateway process
- Multi-Channel: Supports Telegram, Discord, WhatsApp, iMessage, and more via plugins
Self-Hosted & Data Security
- Local Data: Context and skills stored on your local machine, not in the cloud
- Open Source: MIT license, fully transparent codebase
- Fully Self-Hosted: Runs on your own machine or server
Intelligent Agent Capabilities
- Tool Calling: Native support for tool calls and code execution
- Multi-Agent Routing: Multiple agents working together
- Session Isolation: Sessions isolated by agent/workspace/sender
- Scheduled Tasks: Supports cron-style scheduling
- Persistent Runtime: Runs in the background with persistent memory
📦 Prerequisites
Requirements
- Node.js 22 or higher
- A valid MixRoute API endpoint (typically ending with `/v1`)
- A valid MixRoute API Key
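The Node.js requirement can be checked from a shell before installing. This is a sketch: it reads the version via `node -v` (which prints something like `v22.1.0`) and compares the major component against 22.

```shell
# Check that the installed Node.js meets the "22 or higher" requirement.
# `node -v` prints e.g. "v22.1.0"; fall back to "v0" if node is missing.
ver="$(node -v 2>/dev/null || echo v0)"
major="${ver#v}"        # drop the leading "v"
major="${major%%.*}"    # keep only the major version component
if [ "$major" -ge 22 ]; then
  echo "Node.js $ver satisfies the requirement"
else
  echo "Node.js 22+ required (found: $ver)"
fi
```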
1. Install OpenClaw (macOS/Linux)
2. Run the Onboarding Wizard
3. Verify Gateway and Control UI
4. Locate the Config File
The config file is usually at `~/.openclaw/openclaw.json`. You can modify it after the wizard generates it.
🚀 Use MixRoute API as Model Provider
OpenClaw supports custom or OpenAI-compatible model gateways via `models.providers`. For the MixRoute API, the usual approach is to add it as a custom provider and point the default model to `mixroute/model-id`.
Setup Overview
- Declare a `mixroute` provider under `models.providers`
- Set `baseUrl` to your MixRoute API address (include `/v1`)
- Set `api` to `openai-completions`
- List the model IDs you want to use in `models`
- Set `agents.defaults.model.primary` to `mixroute/...`
Recommended: Use Environment Variables for Keys
Set your MixRoute API key in your shell, service environment, or a `.env` file that OpenClaw can read:
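For example, in a POSIX shell (the key value is a placeholder; the variable name matches the `${MIXROUTE_API_KEY}` reference used in the Configuration Reference below):

```shell
# Export the key so the Gateway process inherits it; the name matches
# the ${MIXROUTE_API_KEY} placeholder referenced in openclaw.json.
export MIXROUTE_API_KEY="sk-your-key-here"   # placeholder value
```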
openclaw.json:
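A minimal sketch of the relevant portion of `openclaw.json`, assembled from the settings in the Configuration Reference below. `model-id` is a placeholder for a model ID actually exposed by your MixRoute endpoint, and the exact shape of the `models` array is an assumption based on the `id` fields mentioned under Troubleshooting:

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "mixroute": {
        "baseUrl": "https://api.mixroute.ai/v1",
        "apiKey": "${MIXROUTE_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "model-id" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "mixroute/model-id"
      }
    }
  }
}
```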
Configuration Reference
| Setting | Description |
|---|---|
| `models.mode` | Set to `merge` to keep built-in providers and add `mixroute` |
| `models.providers.mixroute.baseUrl` | MixRoute API endpoint, e.g. `https://api.mixroute.ai/v1` (include `/v1`) |
| `models.providers.mixroute.apiKey` | MixRoute API key; use `${MIXROUTE_API_KEY}` for env injection |
| `models.providers.mixroute.api` | Use `openai-completions` for OpenAI-compatible gateways like the MixRoute API |
| `models.providers.mixroute.models` | Model IDs must match those exposed by the MixRoute API |
| `agents.defaults.model.primary` | Default model; format must be `provider/model-id` |
| `agents.defaults.model.fallbacks` | Fallback models used when the primary fails |
| `agents.defaults.models` | Optional; aliases for easier use in the UI and sessions |
Verify Connection
After configuring, open or refresh the Control UI. If models with the `mixroute/` prefix appear in the model selector, the setup is successful. You can also list the available models from the command line and confirm that entries with the `mixroute/` prefix appear in the list.
Troubleshooting
- Missing `/v1` in `baseUrl`: one of the most common setup errors.
- Wrong model ID: `primary` and `fallbacks` must match the `id` values in `models.providers.mixroute.models`.
- Key not available to the service: if the Gateway runs as a service, make sure it can read `MIXROUTE_API_KEY`.
- Debug in the foreground: use `openclaw gateway --port 18789` to run in the foreground and inspect logs.
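The first pitfall above can be caught with a quick check before restarting the Gateway. This is a sketch: substitute the `baseUrl` value from your own `openclaw.json`.

```shell
# Flag the most common misconfiguration: a baseUrl without the /v1 suffix.
BASE_URL="https://api.mixroute.ai/v1"   # paste your configured baseUrl here
case "$BASE_URL" in
  */v1) echo "baseUrl OK: $BASE_URL" ;;
  *)    echo "warning: baseUrl should end with /v1: $BASE_URL" ;;
esac
```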