mixflow.ai
Mixflow Admin · Technology · 6 min read

Running OpenClaw with Mixflow Provider on Mac

A step-by-step guide to setting up OpenClaw on your Mac using Mixflow as the AI provider. Route requests to GPT-5.2 Codex, Claude Opus 4.6, and Gemini Pro 3 through a single unified gateway.

OpenClaw is an open-source AI coding agent that runs entirely in your terminal. By default it connects directly to model providers, but with Mixflow you can route all your AI requests through a single gateway — giving you unified access to GPT-5.2 Codex, Claude Opus 4.6, and Gemini Pro 3 without managing multiple API keys or endpoints.

This guide walks you through getting OpenClaw running on macOS with Mixflow as the provider.

Prerequisites

Before you begin, make sure you have the following installed on your Mac:

  • Git — to clone the repository
  • Docker Desktop for Mac — OpenClaw’s gateway runs in a container
  • A Mixflow account with an API key (sign up at mixflow.ai)
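
You can confirm the first two are on your PATH before continuing; a minimal check, assuming the standard command names (`git`, `docker`):

```shell
# Report whether each required tool is installed and on the PATH.
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1 found"
  else
    echo "missing: $1 -- install it before continuing"
  fi
}

check git
check docker
```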

Step 1: Clone the OpenClaw Repository

Open your terminal and clone the official OpenClaw repo:

git clone https://github.com/openclaw-ai/openclaw.git
cd openclaw

Step 2: Run the Docker Setup Script

OpenClaw ships with a setup script that pulls the required images and configures the environment:

./docker-setup.sh

Follow the on-screen prompts. The script walks you through initial configuration, including networking and volume mounts for the gateway container.

Step 3: Start the OpenClaw Gateway

From the root of the cloned repository, bring up the gateway service:

docker compose up -d openclaw-gateway

The -d flag runs it in detached mode so it continues running in the background. You can verify it’s up with:

docker compose ps

You should see openclaw-gateway listed with a status of running.

Step 4: Configure OpenClaw to Use Mixflow

Now comes the key part — pointing OpenClaw at Mixflow so all model requests are routed through your Mixflow account.

Open the OpenClaw configuration file:

open ~/.openclaw/openclaw.json

Or use your preferred editor:

code ~/.openclaw/openclaw.json
# or
nano ~/.openclaw/openclaw.json
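
Before replacing the contents, it is worth keeping a copy of the existing file so you can roll back if the new configuration misbehaves:

```shell
# Back up the current config (if it exists) before overwriting it.
CONFIG="$HOME/.openclaw/openclaw.json"
if [ -f "$CONFIG" ]; then
  cp "$CONFIG" "$CONFIG.bak"
fi
```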

Replace the contents with the following configuration (or merge the relevant sections into your existing config):

{
  "meta": {
    "lastTouchedVersion": "2026.2.6",
    "lastTouchedAt": "2026-02-07T05:35:40.720Z"
  },
  "wizard": {
    "lastRunAt": "2026-02-07T05:35:40.712Z",
    "lastRunVersion": "2026.2.6",
    "lastRunCommand": "onboard",
    "lastRunMode": "local"
  },
  "models": {
    "providers": {
      "mixflow-codex": {
        "baseUrl": "https://app.mixflow.ai/api/mixflow/v1/",
        "apiKey": "YOUR_MIXFLOW_API_KEY",
        "api": "openai-responses",
        "models": [
          {
            "id": "gpt-5.2-codex",
            "name": "gpt-5.2-codex",
            "reasoning": false,
            "input": ["text"],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 200000,
            "maxTokens": 8192
          }
        ]
      },
      "mixflow-claude": {
        "baseUrl": "https://app.mixflow.ai/api/anthropic",
        "apiKey": "YOUR_MIXFLOW_API_KEY",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "claude-opus-4.6",
            "name": "claude-opus-4.6",
            "reasoning": false,
            "input": ["text"],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 200000,
            "maxTokens": 8192
          }
        ]
      },
      "mixflow-gemini": {
        "baseUrl": "https://app.mixflow.ai/api/gemini/v1beta/models/gemini-pro-3",
        "apiKey": "YOUR_MIXFLOW_API_KEY",
        "api": "google-generative-ai",
        "models": [
          {
            "id": "gemini-pro-3",
            "name": "gemini-pro-3",
            "reasoning": false,
            "input": ["text"],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 200000,
            "maxTokens": 8192
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "mixflow-gemini/gemini-pro-3"
      },
      "models": {
        "mixflow/gpt-5-2": {
          "alias": "gpt-5.2-codex"
        },
        "mixflow/opus-4-6": {
          "alias": "opus-4.6"
        },
        "mixflow/gemini-pro-3": {
          "alias": "gemini-pro-3"
        }
      },
      "workspace": "/home/node/.openclaw/workspace",
      "compaction": {
        "mode": "safeguard"
      },
      "maxConcurrent": 4,
      "subagents": {
        "maxConcurrent": 8
      }
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "auto",
    "controlUi": {
      "enabled": true,
      "allowInsecureAuth": true
    },
    "auth": {
      "mode": "token",
      "token": "YOUR_GATEWAY_AUTH_TOKEN"
    },
    "tailscale": {
      "mode": "off",
      "resetOnExit": false
    }
  }
}

Important: Replace YOUR_MIXFLOW_API_KEY with your actual Mixflow API key (it starts with sk-mixflow-) and YOUR_GATEWAY_AUTH_TOKEN with a secure random token for gateway authentication.
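
For the gateway token, any sufficiently long random string works. One way to generate one, assuming `openssl` is available (it ships with macOS):

```shell
# Print a 64-character hex token suitable for the gateway "auth.token" field.
openssl rand -hex 32
```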

Understanding the Configuration

Let’s break down the key sections:

Model Providers

The models.providers block defines three Mixflow-backed providers:

  • mixflow-codex — Routes to GPT-5.2 Codex via the OpenAI Responses API format. The base URL points at Mixflow’s hosted API (https://app.mixflow.ai/api/mixflow/v1/).
  • mixflow-claude — Routes to Claude Opus 4.6 via the Anthropic Messages API format. Note the different API endpoint path (/api/anthropic).
  • mixflow-gemini — Routes to Gemini Pro 3 via the Google Generative AI API format with the model specified in the URL path.

All three share the same Mixflow API key, meaning you only need one account to access all providers.
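
If you want to confirm the key works before touching OpenClaw, you can call one of the endpoints directly. This is a sketch, not a documented Mixflow route: it assumes the OpenAI-compatible base URL exposes a standard `/models` listing and that your key is exported as `MIXFLOW_API_KEY`; check the Mixflow docs if the route differs.

```shell
# Hypothetical sanity check against the OpenAI-compatible endpoint.
# The /models path is an assumption based on the usual OpenAI API shape.
curl -s "https://app.mixflow.ai/api/mixflow/v1/models" \
  -H "Authorization: Bearer $MIXFLOW_API_KEY" \
  || echo "request failed (check network and API key)"
```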

Agent Defaults

The agents.defaults section configures how OpenClaw’s agents use these models:

  • Primary model is set to mixflow-gemini/gemini-pro-3 — this is the default model agents will use
  • Model aliases let you reference models by short names (e.g., mixflow/gpt-5-2 resolves to gpt-5.2-codex)
  • Concurrency is set to 4 concurrent agents with up to 8 concurrent subagents
  • Compaction mode is set to safeguard to prevent context window overflow
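
A quick way to double-check which model agents will actually pick up, assuming `jq` is installed (`brew install jq`):

```shell
# Print the primary model from the OpenClaw config.
jq -r '.agents.defaults.model.primary' ~/.openclaw/openclaw.json 2>/dev/null \
  || echo "config not found or jq missing"
```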

Gateway Settings

The gateway block controls the OpenClaw gateway process:

  • Runs on port 18789 in local mode
  • Control UI is enabled — you can access it in your browser to monitor requests
  • Authentication uses a token-based scheme
  • Tailscale networking is disabled (for local-only setups)
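
With the gateway up, the control UI should answer on the configured port. A quick probe from the terminal (the root path `/` is an assumption; the UI may live under a different route):

```shell
# Print the HTTP status code if something is listening on the gateway port.
curl -s -o /dev/null -w "%{http_code}\n" "http://localhost:18789/" \
  || echo "gateway not reachable on port 18789"
```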

Step 5: Verify the Setup

Restart the gateway to pick up the new configuration:

docker compose restart openclaw-gateway

Then start an OpenClaw session:

openclaw

You should see the agent initialize and connect through Mixflow. Try a simple prompt to confirm everything is working:

> Write a hello world script in Python

If the agent responds with generated code, your setup is working correctly. All requests now flow through your local OpenClaw gateway to Mixflow, which forwards them to the model provider of your choice.

Switching Between Models

You can change the default model by updating the agents.defaults.model.primary field in your configuration:

  • "mixflow-codex/gpt-5.2-codex" — Use GPT-5.2 Codex as the primary model
  • "mixflow-claude/claude-opus-4.6" — Use Claude Opus 4.6 as the primary model
  • "mixflow-gemini/gemini-pro-3" — Use Gemini Pro 3 as the primary model

After changing the config, restart the gateway and start a new OpenClaw session to use the updated model.
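
Rather than editing the file by hand, you can script the switch. A sketch assuming `jq` is installed, using GPT-5.2 Codex as the example target:

```shell
# Point the default model at GPT-5.2 Codex, then restart the gateway.
CONFIG="$HOME/.openclaw/openclaw.json"
if jq '.agents.defaults.model.primary = "mixflow-codex/gpt-5.2-codex"' "$CONFIG" > "$CONFIG.tmp"; then
  mv "$CONFIG.tmp" "$CONFIG"
  docker compose restart openclaw-gateway
else
  echo "failed to update $CONFIG" >&2
fi
```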

Troubleshooting

Gateway won’t start: Make sure Docker Desktop is running and you have enough resources allocated (at least 4 GB of RAM recommended).

Connection refused errors: Verify that the gateway container has network access and can reach app.mixflow.ai (the host used in all three provider base URLs), and that no other process is already bound to the gateway port (18789).

Authentication failures: Double-check your Mixflow API key. It should start with sk-mixflow- and must be the same across all three provider entries.

Models not responding: Ensure the model IDs in your config match what’s available on your Mixflow plan. You can verify available models in the Mixflow dashboard.
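
A malformed openclaw.json is another easy thing to rule out first. Assuming python3 is available, you can validate the file without extra tools:

```shell
# Fail loudly if the config is not valid JSON.
if python3 -m json.tool ~/.openclaw/openclaw.json >/dev/null 2>&1; then
  echo "config is valid JSON"
else
  echo "config is missing or malformed" >&2
fi
```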


Summary

With this setup you get the best of both worlds: OpenClaw’s powerful terminal-based coding agent paired with Mixflow’s unified model gateway. You can switch between GPT-5.2 Codex, Claude Opus 4.6, and Gemini Pro 3 without juggling separate API keys or changing endpoint configurations. Everything routes through a single Mixflow API key, and your local Docker gateway handles the rest.
