    MegaRouter Documentation

    A unified AI model routing platform. One API key, 30+ models, smart auto-routing.


    Getting Started

    1. Create an API key

    1. Go to megarouter.com, choose a login method, and authorize
    2. Go to Console → Settings → API keys → Create a key

    2. Auto routing (optional)

    Auto routing is enabled by default. To control it:

    Console → Settings → Routing → Auto routing toggle

    Once enabled, MegaRouter automatically selects the best model for each request. If you prefer to pick models yourself, skip this step and specify models directly (e.g. anthropic/claude-sonnet-4.6).
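The difference is just the model field on each request; a minimal sketch of the two payload shapes (model IDs taken from this doc):

```python
# Auto routing: MegaRouter picks the model for this request.
auto_request = {
    "model": "auto",
    "messages": [{"role": "user", "content": "how are you?"}],
}

# Explicit routing: pin a specific model ID from the catalog.
pinned_request = {
    "model": "anthropic/claude-sonnet-4.6",
    "messages": [{"role": "user", "content": "how are you?"}],
}
```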


    Standard Setup

    Fully compatible with the OpenAI API. Supports Python, Node.js, curl, and tools across the ecosystem.

    Point your client at the Base URL ( https://api.megarouter.com/openai/v1 ) and supply your MegaRouter API key to get started.

    from openai import OpenAI
    
    client = OpenAI(
        api_key="MEGAROUTER_API_KEY",  # create a key in the Console (Settings → API keys)
        base_url="https://api.megarouter.com/openai/v1",
    )
    
    completion = client.chat.completions.create(
        model="auto",
        messages=[
            {"role": "system", "content": "system prompt"},
            {"role": "user", "content": "how are you?"}
        ],
    )
    
    # get the response from LLM (role=assistant)
    print(completion.choices[0].message.content)

    Response example:

    {
        "id": "243c850e-214c-431e-977f-ebaf4aa95f56",
        "choices": [
            {
                "index": 0,
                "message": {
                    "role": "assistant",
                    "content": "Hello! Nice to meet you. How can I help you?"
                },
                "finish_reason": "stop"
            }
        ],
        "created": 1773408946,
        "model": "deepseek/deepseek-v3.2",
        "object": "chat.completion",
        "usage": {
            "prompt_tokens": 5,
            "completion_tokens": 15,
            "total_tokens": 20
        }
    }
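The fields above can be read like any OpenAI-style completion; a small sketch against a dict shaped like that example:

```python
# A parsed response shaped like the example above (values copied from it).
response = {
    "choices": [{
        "index": 0,
        "message": {"role": "assistant",
                    "content": "Hello! Nice to meet you. How can I help you?"},
        "finish_reason": "stop",
    }],
    "usage": {"prompt_tokens": 5, "completion_tokens": 15, "total_tokens": 20},
}

# The assistant text lives on the first choice; usage carries billed token counts.
answer = response["choices"][0]["message"]["content"]
billed = response["usage"]["total_tokens"]
```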

    OpenClaw Setup

    If you already have OpenClaw installed, follow the steps below to connect MegaRouter.

    Connect MegaRouter

    Method 1: Web console

    1. Start the web console

    In a terminal, run:

    openclaw dashboard

    The browser will open the console (usually http://127.0.0.1:18789). If the browser does not open automatically, please visit that address manually.

    2. Go to configuration page

    Select Config → Raw mode.

    3. Add MegaRouter configuration

    Add env and set MEGAROUTER_API_KEY to your MegaRouter API Key:

    env: {
      vars: {
        MEGAROUTER_API_KEY: 'sk-or-v1-xxxxxxxxxxxxxxxx',
      },
    },

    Add models with baseUrl set to https://api.megarouter.com/openai/v1:

    models: {
      mode: 'merge',
      providers: {
        megarouter: {
          baseUrl: 'https://api.megarouter.com/openai/v1',
          apiKey: '${MEGAROUTER_API_KEY}',
          api: 'openai-completions',
          models: [
            {
              id: 'megarouter/auto',
              name: 'MegaRouter Auto',
              api: 'openai-completions',
              reasoning: false,
              input: ['text'],
              cost: {
                input: 0,
                output: 0,
                cacheRead: 0,
                cacheWrite: 0,
              },
              contextWindow: 200000,
              maxTokens: 8192,
            },
          ],
        },
      },
    },

    Replace the original agents: {...} section with:

    agents: {
      defaults: {
        model: {
          primary: 'megarouter/auto',
        },
        models: {
          'megarouter/auto': {
            alias: 'MegaRouter Auto',
          },
        },
      },
    },

    4. Save and apply configuration

    Web console: Click Save in the top right, then Update.

    5. Verify connection

    In OpenClaw Chat, send a test message such as "Hello". If configured correctly, the request is sent to the MegaRouter API, auto-routed to the best model, and a response is returned.

    Method 2: Edit config file

    1. Locate the openclaw.json file

    macOS:

    Open Finder, press Command + Shift + G

    Enter: ~/.openclaw

    Press Enter to see openclaw.json.

    Windows:

    Path: C:\Users\<YourUsername>\.openclaw\openclaw.json

    2. Add MegaRouter configuration

    Add env and set MEGAROUTER_API_KEY to your MegaRouter API Key:

    "env": {
      "vars": {
        "MEGAROUTER_API_KEY": "sk-or-v1-xxxxxxxxxxxxxxxx"
      }
    },

    Add models with baseUrl set to https://api.megarouter.com/openai/v1:

    "models": {
      "mode": "merge",
      "providers": {
        "megarouter": {
          "baseUrl": "https://api.megarouter.com/openai/v1",
          "apiKey": "${MEGAROUTER_API_KEY}",
          "api": "openai-completions",
          "models": [
            {
              "id": "megarouter/auto",
              "name": "MegaRouter Auto",
              "api": "openai-completions",
              "reasoning": false,
              "input": ["text"],
              "cost": {
                "input": 0,
                "output": 0,
                "cacheRead": 0,
                "cacheWrite": 0
              },
              "contextWindow": 200000,
              "maxTokens": 8192
            }
          ]
        }
      }
    },

    Replace the original "agents": {...}, section with:

    "agents": {
      "defaults": {
        "model": {
          "primary": "megarouter/auto"
        },
        "models": {
          "megarouter/auto": {
            "alias": "MegaRouter Auto"
          }
        }
      }
    },

    3. Save and verify configuration

    After saving the config file, run the following in a terminal to view the file and confirm it is correct:

    cat ~/.openclaw/openclaw.json
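Beyond eyeballing the output, you can confirm the file parses as strict JSON (a sketch; assumes Python 3 is installed — strict JSON rejects trailing commas and comments):

```python
import json
import os

# Config path from the steps above (adjust on Windows).
CONFIG_PATH = os.path.expanduser("~/.openclaw/openclaw.json")

def load_config(path):
    """Parse the file, raising ValueError if the JSON is malformed."""
    with open(path) as f:
        return json.load(f)
```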

    4. Verify connection

    Run the following in a local terminal to start a CLI conversation:

    openclaw tui

    Or run the following to use OpenClaw Chat in the browser:

    openclaw dashboard

    Optional configuration

    Auto model routing

    MegaRouter recommends setting primary to megarouter/auto, which automatically selects the best model by price, latency, and availability.

    Use a specific model

    To use a fixed model, set primary to a specific ID, e.g. megarouter/deepseek/deepseek-v3.2.

    FAQ

    1. Only OpenAI models succeed; other models fail

      Models available through MegaRouter use the OpenAI-compatible protocol, so the api field in the OpenClaw integration settings must be set to openai-completions (as in the examples above). If OpenAI-family models work but all others fail, the api type in the providers entry is the first thing to check.

    2. Model not found or empty response

      Confirm that the model ID is spelled correctly, that the provider name referenced in agents matches the one defined under providers, and that reasoning is set to false.


    QClaw Setup

    If you already have QClaw installed, follow these steps to connect MegaRouter.

    Configure in chat

    1. In the chat, send the message below. Replace the apiKey value with your MegaRouter API key.

    Help me add a new provider
    Name: MegaRouter
    apiKey: sk-or-v1-xxxxxxxxxxxxxxxx
    baseUrl: https://api.megarouter.com/openai/v1
    Models (you can pass multiple): 1. auto  2. deepseek/deepseek-v3.2

    QClaw will add the provider and restart automatically.

    2. Verify

    Ask: “Help me verify that my MegaRouter configuration is working.” The assistant should reply with something like “MegaRouter provider was added successfully!” (exact wording may vary).

    3. Switch to MegaRouter

    Ask: “Switch to auto under MegaRouter.” The assistant should reply with something like “Switched successfully!” (exact wording may vary).


    AutoClaw Setup

    1. Open the configuration entry

    Click Preferences in the bottom-left, go to Models & API, then click Add custom model.

    2. Add a model

    • Set the provider to Custom.
    • Enter a MegaRouter-supported model ID, e.g. deepseek/deepseek-v3.2.
    • Enter a display name, e.g. MegaRouter(deepseek-v3.2).
    • Enter your API Key, e.g. sk-or-v1-xxxxxxxxxxxxxxxx.
    • Base URL: https://api.megarouter.com/openai/v1

    3. Test the configuration

    Click the connection test. If you see “Test successful”, the setup is correct.

    4. Use the model

    • Click Add. After it saves, return to the app.
    • Below the chat input, switch the model to your configured MegaRouter(deepseek-v3.2) to use it.

    Cursor Setup

    If you already have Cursor installed, follow these steps to connect MegaRouter.

    1. Open Cursor Settings

    Use the menu in the top-right corner → Settings.

    Placeholder for a screenshot of opening Cursor Settings from the top-right menu

    2. Models

    In the left sidebar:

    • Open Models.
    • Choose View All Models, scroll to the bottom, then Add Custom Model.
    • Enter the specific model ID, e.g. deepseek/deepseek-v3.2. Do not use auto.
    Placeholder for a screenshot of Models, View All Models, and Add Custom Model

    3. Add MegaRouter

    Configure API access:

    • Expand API Keys.
    • Paste your MegaRouter API key.
    • Set Base URL to https://api.megarouter.com/openai/v1
    Placeholder for a screenshot of API Keys and Base URL fields

    4. Save and close the Settings page.

    5. Use MegaRouter in Cursor

    In Chat, Composer, or Agent, choose your MegaRouter model from the model dropdown.

    Placeholder for a screenshot of selecting a MegaRouter model in the dropdown

    API Reference

    Field      Value
    Base URL   https://api.megarouter.com/openai/v1
    Auth       Authorization: Bearer <API_KEY>
    Format     OpenAI-compatible
    Pricing    Pay-as-you-go

    Note: The API path is /openai/v1 (not /v1).

    Endpoints

    Method   Path                Description
    POST     /chat/completions   Chat completions (streaming supported)
    GET      /models             List available models
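Putting those values together, the request pieces can be assembled like this (a sketch; no network call is made here):

```python
# Values from the tables above; the API key is a placeholder.
BASE_URL = "https://api.megarouter.com/openai/v1"  # note the /openai/v1 path

def auth_headers(api_key):
    """Bearer-token auth header plus JSON content type."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

chat_url = f"{BASE_URL}/chat/completions"
models_url = f"{BASE_URL}/models"
```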

    Models

    Model ID                      Description                        Use Case
    openai/gpt-5.2                OpenAI latest                      Reasoning tasks
    openai/gpt-5                  OpenAI general-purpose flagship    General purpose
    openai/gpt-5-mini             OpenAI lightweight                 General / cost optimization
    openai/gpt-5-nano             OpenAI ultra low cost              Simple tasks
    openai/gpt-4.1                OpenAI stable                      General purpose
    openai/gpt-4.1-nano           OpenAI lightweight stable          Simple tasks
    anthropic/claude-opus-4.6     Anthropic's most capable           Complex reasoning
    anthropic/claude-sonnet-4.6   Anthropic balanced                 General purpose
    anthropic/claude-sonnet-4.5   Anthropic previous gen             General purpose
    anthropic/claude-haiku-4.5    Anthropic fast                     Simple tasks
    google/gemini-3.1-pro         Google latest flagship             Long context / reasoning
    google/gemini-2.5-pro         Google previous gen flagship       Long context
    deepseek/deepseek-v3.2        DeepSeek latest                    Cost-effective
    deepseek/deepseek-v3.1        DeepSeek previous gen              General purpose
    x-ai/grok-4                   xAI latest flagship                Reasoning / real-time info
    x-ai/grok-4.1-fast            xAI high-speed                     Fast response
    moonshotai/kimi-k2.5          Moonshot strong long-context       Long context
    z-ai/glm-5                    Z.ai latest                        General purpose
    z-ai/glm-5-turbo              Coding & reasoning                 Multi-scenario
    z-ai/glm-4.7-flash            Z.ai fast tier                     Simple tasks
    minimax/minimax-m2.5          MiniMax multimodal                 General purpose

    Model ID format: provider/model-name. Version numbers use . (e.g. 4.6), not -.
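A loose sanity check of that format (illustrative only — it checks the provider/model-name shape, not whether the model actually exists):

```python
import re

# One provider segment, a slash, then a model name; versions use dots (e.g. 4.6).
MODEL_ID = re.compile(r"^[a-z0-9-]+/[a-z0-9.-]+$")

def looks_like_model_id(model_id):
    """True if the string matches the provider/model-name shape described above."""
    return bool(MODEL_ID.fullmatch(model_id))
```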

    For more models, visit the Models page.


    Troubleshooting

    Error                                Cause                          Solution
    auto routing is not enabled          Auto routing not turned on     Open Dashboard → Settings → Routing, then turn on auto routing
    provider routing is not configured   Wrong model ID format          Open Docs → Models to browse the catalog
    404 page not found                   Wrong API path                 Confirm Base URL is https://api.megarouter.com/openai/v1
    unsupported parameter: max_tokens    Some models don't support it   Use max_completion_tokens instead
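For the last error, a small helper can rename the parameter before sending (a sketch; assumes max_completion_tokens is a drop-in replacement on models that reject max_tokens):

```python
def portable_params(params):
    """Return a copy with max_tokens renamed to max_completion_tokens."""
    fixed = dict(params)
    if "max_tokens" in fixed:
        fixed["max_completion_tokens"] = fixed.pop("max_tokens")
    return fixed
```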