Introduction

OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. Designed for developers and power users, it lets you run your own autonomous AI assistant without handing over control of your data. OpenClaw is fully open source: you can browse the source code, submit issues, or contribute on GitHub. This guide covers installation, configuration, and the complete steps to connect OpenClaw to MixRoute API.

🌟 Core Features

Multi-Channel Integration

  • Canvas UI: Renders interactive Canvas interfaces
  • Voice Support: Supports voice interaction on macOS/iOS/Android
  • Single Gateway: Manages all channels through one Gateway process
  • Multi-Channel: Supports Telegram, Discord, WhatsApp, iMessage, and more via plugins

Self-Hosted & Data Security

  • Local Data: Context and skills stored on your local machine, not in the cloud
  • Open Source: MIT license, fully transparent codebase
  • Fully Self-Hosted: Runs on your own machine or server

Intelligent Agent Capabilities

  • Tool Calling: Native support for tool calls and code execution
  • Multi-Agent Routing: Multiple agents working together
  • Session Isolation: Sessions isolated by agent/workspace/sender
  • Scheduled Tasks: Supports cron-style scheduling
  • Persistent Runtime: Runs in the background with persistent memory

📦 Prerequisites

Requirements

  • Node.js 22 or higher
  • A valid MixRoute API endpoint (typically ending with /v1)
  • A valid MixRoute API Key
Before connecting to MixRoute API, we recommend first bringing up the OpenClaw Gateway and Control UI using the official setup flow. This makes it easier to tell whether issues come from OpenClaw itself or from the model provider configuration.
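Since OpenClaw requires Node.js 22 or higher, a quick shell check can confirm this before installing. This is a sketch that assumes node is on your PATH; the version parsing itself is plain bash:

```shell
# Read the installed Node.js version (falls back to v0.0.0 if node is absent)
node_version="$(node --version 2>/dev/null || echo v0.0.0)"
# Strip the leading "v" and everything after the major version number
major="${node_version#v}"; major="${major%%.*}"
if [ "$major" -ge 22 ]; then
  echo "Node.js $node_version meets the 22+ requirement"
else
  echo "Node.js 22+ required, found: $node_version" >&2
fi
```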

1. Install OpenClaw (macOS/Linux)

curl -fsSL https://openclaw.ai/install.sh | bash
Other installation methods: Getting Started.

2. Run the Onboarding Wizard

openclaw onboard --install-daemon
The wizard completes basic auth, Gateway setup, and optional channel initialization. The goal is to get OpenClaw running first, then switch the default model to MixRoute API.

3. Verify Gateway and Control UI

openclaw gateway status
openclaw dashboard
If the Control UI opens in your browser, OpenClaw is running correctly. You don’t need to configure Telegram, Discord, or other channels at this stage.

4. Locate the Config File

The config file is usually at ~/.openclaw/openclaw.json. You can modify it after the wizard generates it.
If you run OpenClaw under a dedicated service account or want custom config/state directories, use OPENCLAW_CONFIG_PATH, OPENCLAW_STATE_DIR, and OPENCLAW_HOME. See Environment Variables.
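For example, a dedicated service account might point OpenClaw at its own directories like this. The /srv/openclaw paths below are illustrative choices, not defaults:

```shell
# Custom locations for OpenClaw's home, config file, and state directory
# (example paths for a service account; substitute your own)
export OPENCLAW_HOME="/srv/openclaw"
export OPENCLAW_CONFIG_PATH="/srv/openclaw/openclaw.json"
export OPENCLAW_STATE_DIR="/srv/openclaw/state"
```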

🚀 Use MixRoute API as Model Provider

OpenClaw supports custom or OpenAI-compatible model gateways via models.providers. For MixRoute API, the usual approach is to add it as a custom provider and point the default model to mixroute/model-id.

Setup Overview

  1. Declare a mixroute provider under models.providers
  2. Set baseUrl to your MixRoute API address (include /v1)
  3. Set api to openai-completions
  4. List the model IDs you want to use in models
  5. Set agents.defaults.model.primary to mixroute/...
Set your MixRoute API key in your shell, service environment, or a .env file that OpenClaw can read:
export MIXROUTE_API_KEY="sk-your-mixroute-key"
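If you prefer a .env file, one option is to keep it beside the config with restricted permissions. This assumes OpenClaw reads a .env from its config directory; adjust the location to your setup:

```shell
# Store the key in ~/.openclaw/.env and lock down file permissions
# (placeholder key shown; use your real MixRoute API key)
mkdir -p "$HOME/.openclaw"
printf 'MIXROUTE_API_KEY=%s\n' "sk-your-mixroute-key" >> "$HOME/.openclaw/.env"
chmod 600 "$HOME/.openclaw/.env"
```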
Then add or update this section in openclaw.json:
{
  "models": {
    "mode": "merge",
    "providers": {
      "mixroute": {
        "baseUrl": "https://api.mixroute.ai/v1",
        "apiKey": "${MIXROUTE_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "gemini-2.5-flash", "name": "Gemini 2.5 Flash" },
          { "id": "kimi-k2.5", "name": "Kimi K2.5" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "mixroute/gemini-2.5-flash",
        "fallbacks": ["mixroute/kimi-k2.5"]
      },
      "models": {
        "mixroute/gemini-2.5-flash": { "alias": "flash" },
        "mixroute/kimi-k2.5": { "alias": "kimi" }
      }
    }
  }
}
This is the minimal config needed to connect MixRoute API. As long as the provider name, model IDs, and default model reference match, OpenClaw will use MixRoute API for your models.
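A quick grep-based sanity check can catch the two most common mistakes before restarting the Gateway. This is a rough sketch, not a full JSON parse; it only pattern-matches the config text:

```shell
# Check the config file for a /v1 baseUrl and a mixroute/ default model
# (respects OPENCLAW_CONFIG_PATH if set, else the default location)
cfg="${OPENCLAW_CONFIG_PATH:-$HOME/.openclaw/openclaw.json}"
grep -q '"baseUrl": *"[^"]*/v1"' "$cfg" 2>/dev/null \
  && echo "baseUrl includes /v1" \
  || echo "warning: baseUrl may be missing /v1" >&2
grep -q '"primary": *"mixroute/' "$cfg" 2>/dev/null \
  && echo "default model points at mixroute" \
  || echo "warning: primary is not a mixroute/ model" >&2
```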

Configuration Reference

  • models.mode: Set to merge to keep built-in providers while adding mixroute
  • models.providers.mixroute.baseUrl: MixRoute API endpoint, e.g. https://api.mixroute.ai/v1 (include /v1)
  • models.providers.mixroute.apiKey: MixRoute API key; use ${MIXROUTE_API_KEY} for env injection
  • models.providers.mixroute.api: Use openai-completions for OpenAI-compatible gateways like MixRoute API
  • models.providers.mixroute.models: Model IDs must match those exposed by MixRoute API
  • agents.defaults.model.primary: Default model; format must be provider/model-id
  • agents.defaults.model.fallbacks: Fallback models used when the primary fails
  • agents.defaults.models: Optional; aliases for easier use in the UI and sessions

Verify Connection

After configuring, open or refresh the Control UI:
openclaw dashboard
If you can start a conversation and the default model shows mixroute/..., the setup is successful. You can also run:
openclaw models list
to confirm that models with the mixroute/ prefix appear in the list.

Troubleshooting

  • Missing /v1 in baseUrl: One of the most common setup errors.
  • Wrong model ID: primary and fallbacks must match the id values in models.providers.mixroute.models.
  • Key not available to service: If Gateway runs as a service, ensure it can read MIXROUTE_API_KEY.
  • Debug in foreground: Use openclaw gateway --port 18789 to run in the foreground and inspect logs.
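To tell provider-side failures apart from OpenClaw misconfiguration, you can also query MixRoute API directly. This assumes MixRoute exposes the standard OpenAI-style /v1/models route, which is typical for OpenAI-compatible gateways:

```shell
# Build the models endpoint from the same baseUrl used in openclaw.json
base_url="https://api.mixroute.ai/v1"
models_url="${base_url%/}/models"
echo "querying: $models_url"
# Only call the API when a key is actually set in the environment;
# a failure here points at the key, endpoint, or network rather than OpenClaw
if [ -n "${MIXROUTE_API_KEY:-}" ]; then
  curl -fsS -H "Authorization: Bearer $MIXROUTE_API_KEY" "$models_url" \
    || echo "request failed: check the key, baseUrl, and network" >&2
fi
```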