MegaRouter Documentation
A unified AI model routing platform. One API key, 30+ models, smart auto-routing.
Getting Started
1. Create an API key
- Go to megarouter.com, choose login method, and authorize
- Go to Console → Settings → API keys → Create a key
2. Auto routing (optional)
Auto routing is enabled by default. To control it:
Console → Settings → Routing → Auto routing toggle
Once enabled, MegaRouter automatically selects the best model for each request. If you prefer to pick models yourself, skip this step and specify models directly (e.g. anthropic/claude-sonnet-4.6).
Standard Setup
Fully compatible with the OpenAI API. Supports Python, Node.js, curl, and tools across the ecosystem.
Set the base URL to https://api.megarouter.com/openai/v1 and supply your MegaRouter API key to get started.
from openai import OpenAI
client = OpenAI(
api_key="MEGAROUTER_API_KEY", # get MEGAROUTER_API_KEY from megarouter.com (API Key)
base_url="https://api.megarouter.com/openai/v1",
)
completion = client.chat.completions.create(
model="auto",
messages=[
{"role": "system", "content": "system prompt"},
{"role": "user", "content": "how are you?"}
],
)
# get the response from LLM (role=assistant)
print(completion.choices[0].message.content)
Response example:
{
"id": "243c850e-214c-431e-977f-ebaf4aa95f56",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello! Nice to meet you. How can I help you?"
},
"finish_reason": "stop"
}
],
"created": 1773408946,
"model": "deepseek.v3-v1:0",
"object": "chat.completion",
"usage": {
"prompt_tokens": 5,
"completion_tokens": 15,
"total_tokens": 20
}
}
OpenClaw Setup
If you already have OpenClaw installed, follow the steps below to connect MegaRouter.
Connect MegaRouter
Method 1: Web console
1. Start the web console
In a terminal, run:
openclaw dashboard
The browser will open the console (usually http://127.0.0.1:18789). If the browser does not open automatically, visit that address manually.
2. Go to configuration page
Select Config → Raw mode.
3. Add MegaRouter configuration
Add env and set MEGAROUTER_API_KEY to your MegaRouter API Key:
env: {
vars: {
MEGAROUTER_API_KEY: 'sk-or-v1-xxxxxxxxxxxxxxxx',
},
},
Add models with baseUrl set to https://api.megarouter.com/openai/v1:
models: {
mode: 'merge',
providers: {
megarouter: {
baseUrl: 'https://api.megarouter.com/openai/v1',
apiKey: '${MEGAROUTER_API_KEY}',
api: 'openai-completions',
models: [
{
id: 'megarouter/auto',
name: 'MegaRouter Auto',
api: 'openai-completions',
reasoning: false,
input: ['text'],
cost: {
input: 0,
output: 0,
cacheRead: 0,
cacheWrite: 0,
},
contextWindow: 200000,
maxTokens: 8192,
},
],
},
},
},
Replace the original "agents": {...} section with:
agents: {
defaults: {
model: {
primary: 'megarouter/auto',
},
models: {
'megarouter/auto': {
alias: 'MegaRouter Auto',
},
},
},
},
4. Save and apply configuration
Web console: Click Save in the top right, then Update.
5. Verify connection
In OpenClaw Chat, send a test message such as "Hello". If configured correctly, the request is sent to the MegaRouter API, auto-routed to the best model, and a response is returned.
Method 2: Edit config file
1. Locate the openclaw.json file
macOS:
Open Finder, press Command + Shift + G
Enter: ~/.openclaw
Press Enter to see openclaw.json.
Windows:
Path: C:\Users\<YourUsername>\.openclaw\openclaw.json
2. Add MegaRouter configuration
Add env and set MEGAROUTER_API_KEY to your MegaRouter API Key:
"env": {
"vars": {
"MEGAROUTER_API_KEY": "sk-or-v1-xxxxxxxxxxxxxxxx"
}
},
Add models with baseUrl set to https://api.megarouter.com/openai/v1:
"models": {
"mode": "merge",
"providers": {
"megarouter": {
"baseUrl": "https://api.megarouter.com/openai/v1",
"apiKey": "${MEGAROUTER_API_KEY}",
"api": "openai-completions",
"models": [
{
"id": "megarouter/auto",
"name": "MegaRouter Auto",
"api": "openai-completions",
"reasoning": false,
"input": ["text"],
"cost": {
"input": 0,
"output": 0,
"cacheRead": 0,
"cacheWrite": 0
},
"contextWindow": 200000,
"maxTokens": 8192
}
]
}
}
},
Replace the original "agents": {...} section with:
"agents": {
"defaults": {
"model": {
"primary": "megarouter/auto"
},
"models": {
"megarouter/auto": {
"alias": "MegaRouter Auto"
}
}
}
},
3. Save and verify configuration
After saving the config file, run the following in a terminal to view the file and confirm it is correct:
cat ~/.openclaw/openclaw.json
4. Verify connection
Run the following in a local terminal to start a CLI conversation:
openclaw tui
Or run the following to use OpenClaw Chat in the browser:
openclaw dashboard
Optional configuration
Auto model routing
MegaRouter recommends setting primary to megarouter/auto, which automatically selects the best model based on price, latency, and availability.
Use a specific model
To use a fixed model, set primary to a specific model ID, e.g. megarouter/deepseek/deepseek-v3.2.
FAQ
Only OpenAI models succeed; other models fail
Models available through MegaRouter use the OpenAI-compatible protocol. In OpenClaw integration settings, set the api field to openai-completions (as in the examples above). If OpenAI-family models work but all others fail, check the api type in the providers entry.
Model not found or empty response
Confirm that the model ID is spelled correctly, that the configured provider name matches what you reference, and that reasoning is set to false.
QClaw Setup
If you already have QClaw installed, follow these steps to connect MegaRouter.
Configure in chat
1. In the chat, send the message below. Replace the apiKey value with your MegaRouter API key.
Help me add a new provider
Name: MegaRouter
apiKey: sk-or-v1-xxxxxxxxxxxxxxxx
baseUrl: https://api.megarouter.com/openai/v1
Models (you can pass multiple): 1. auto 2. deepseek/deepseek-v3.2
QClaw will add the provider and restart automatically.
2. Verify
Ask: “Help me verify that my MegaRouter configuration is working.” The assistant should reply with something like “MegaRouter provider was added successfully!” (exact wording may vary).
3. Switch to MegaRouter
Ask: “Switch to auto under MegaRouter.” The assistant should reply with something like “Switched successfully!” (exact wording may vary).
AutoClaw Setup
1. Open the configuration entry
Click Preferences in the bottom-left, go to Models & API, then click Add custom model.
2. Add a model
- Set the provider to Custom.
- Enter a MegaRouter-supported model ID, e.g. deepseek/deepseek-v3.2.
- Enter a display name, e.g. MegaRouter(deepseek-v3.2).
- Enter your API Key, e.g. sk-or-v1-xxxxxxxxxxxxxxxx.
- Base URL: https://api.megarouter.com/openai/v1
3. Test the configuration
Click the connection test. If you see “Test successful”, the setup is correct.
4. Use the model
- Click Add. After it saves, return to the app.
- Below the chat input, switch the model to your configured MegaRouter(deepseek-v3.2) to use it.
Cursor Setup
If you already have Cursor installed, follow these steps to connect MegaRouter.
1. Open Cursor Settings
Use the menu in the top-right corner → Settings.

2. Models
In the left sidebar:
- Open Models.
- Choose View All Models, scroll to the bottom, then Add Custom Model.
- Enter the specific model ID, e.g. deepseek/deepseek-v3.2. Do not use auto.

3. Add MegaRouter
Configure API access:
- Expand API Keys.
- Paste your MegaRouter API key.
- Set Base URL to https://api.megarouter.com/openai/v1

4. Save and close the Settings page.
5. Use MegaRouter in Cursor
In Chat, Composer, or Agent, choose your MegaRouter model from the model dropdown.

API Reference
| Field | Value |
|---|---|
| Base URL | https://api.megarouter.com/openai/v1 |
| Auth | Authorization: Bearer <API_KEY> |
| Format | OpenAI-compatible |
| Pricing | Pay-as-you-go |
Note: The API path is /openai/v1 (not /v1).
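To sanity-check the base URL and auth header from the table, the request can be sketched with only the Python standard library. The helper name build_request is illustrative, not part of any SDK, and the live call at the bottom only runs when a MEGAROUTER_API_KEY environment variable is set:

```python
import json
import os
import urllib.request

BASE_URL = "https://api.megarouter.com/openai/v1"  # note /openai/v1, not /v1

def build_request(path: str, api_key: str) -> tuple:
    """Return the full URL and headers for a MegaRouter API call."""
    url = BASE_URL + path
    headers = {
        "Authorization": f"Bearer {api_key}",  # Auth row of the table
        "Content-Type": "application/json",
    }
    return url, headers

if os.environ.get("MEGAROUTER_API_KEY"):
    # Live request: list available models with your real key.
    url, headers = build_request("/models", os.environ["MEGAROUTER_API_KEY"])
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

A 404 from this call usually means the /openai/v1 path segment was dropped from the base URL.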
Endpoints
| Method | Path | Description |
|---|---|---|
| POST | /chat/completions | Chat completions (streaming supported) |
| GET | /models | List available models |
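The table notes that POST /chat/completions supports streaming; with the OpenAI SDK that is the stream=True flag. The sketch below assumes the standard OpenAI chunk shape (choices[0].delta.content); collect_stream is an illustrative helper, and the live request only runs when MEGAROUTER_API_KEY is set:

```python
import os

def collect_stream(chunks) -> str:
    """Join the incremental content deltas of OpenAI-style streaming
    chunks; the delta content may be None on the final chunk."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)

if os.environ.get("MEGAROUTER_API_KEY"):
    from openai import OpenAI  # pip install openai
    client = OpenAI(
        api_key=os.environ["MEGAROUTER_API_KEY"],
        base_url="https://api.megarouter.com/openai/v1",
    )
    stream = client.chat.completions.create(
        model="auto",
        messages=[{"role": "user", "content": "how are you?"}],
        stream=True,  # chunks arrive as the model generates
    )
    print(collect_stream(stream))
```

In a real UI you would print each delta as it arrives rather than joining them at the end.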
Models
| Model ID | Description | Use Case |
|---|---|---|
| openai/gpt-5.2 | OpenAI latest | Reasoning tasks |
| openai/gpt-5 | OpenAI general-purpose flagship | General purpose |
| openai/gpt-5-mini | OpenAI lightweight | General / cost optimization |
| openai/gpt-5-nano | OpenAI ultra low cost | Simple tasks |
| openai/gpt-4.1 | OpenAI stable | General purpose |
| openai/gpt-4.1-nano | OpenAI lightweight stable | Simple tasks |
| anthropic/claude-opus-4.6 | Anthropic's most capable | Complex reasoning |
| anthropic/claude-sonnet-4.6 | Anthropic balanced | General purpose |
| anthropic/claude-sonnet-4.5 | Anthropic previous gen | General purpose |
| anthropic/claude-haiku-4.5 | Anthropic fast | Simple tasks |
| google/gemini-3.1-pro | Google latest flagship | Long context / reasoning |
| google/gemini-2.5-pro | Google previous gen flagship | Long context |
| deepseek/deepseek-v3.2 | DeepSeek latest | Cost-effective |
| deepseek/deepseek-v3.1 | DeepSeek previous gen | General purpose |
| x-ai/grok-4 | xAI latest flagship | Reasoning / real-time info |
| x-ai/grok-4.1-fast | xAI high-speed | Fast response |
| moonshotai/kimi-k2.5 | Moonshot strong long-context | Long context |
| z-ai/glm-5 | Z.ai latest | General purpose |
| z-ai/glm-5-turbo | Coding & reasoning | Multi-scenario |
| z-ai/glm-4.7-flash | Z.ai fast tier | Simple tasks |
| minimax/minimax-m2.5 | MiniMax multimodal | General purpose |
Model ID format: provider/model-name. Version numbers use . (e.g. 4.6), not -.
For more models, visit the Models page.
Troubleshooting
| Error | Cause | Solution |
|---|---|---|
| auto routing is not enabled | Auto routing not turned on | Open Console → Settings → Routing, then turn on auto routing |
| provider routing is not configured | Wrong model ID format | Open Docs → Models to browse the catalog |
| 404 page not found | Wrong API path | Confirm Base URL is https://api.megarouter.com/openai/v1 |
| unsupported parameter: max_tokens | Some models don't support it | Use max_completion_tokens instead |
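For the last row, a small guard can translate the legacy parameter before a request is sent. adapt_params is an illustrative helper (not part of any SDK), the model ID comes from the table above, and the live call only runs when MEGAROUTER_API_KEY is set:

```python
import os

def adapt_params(params: dict) -> dict:
    """Rename legacy max_tokens to max_completion_tokens for models that
    reject max_tokens; all other keys pass through unchanged."""
    params = dict(params)  # don't mutate the caller's dict
    if "max_tokens" in params:
        params["max_completion_tokens"] = params.pop("max_tokens")
    return params

if os.environ.get("MEGAROUTER_API_KEY"):
    from openai import OpenAI  # pip install openai
    client = OpenAI(
        api_key=os.environ["MEGAROUTER_API_KEY"],
        base_url="https://api.megarouter.com/openai/v1",
    )
    kwargs = adapt_params({"model": "openai/gpt-5-mini", "max_tokens": 256})
    completion = client.chat.completions.create(
        messages=[{"role": "user", "content": "how are you?"}],
        **kwargs,
    )
    print(completion.choices[0].message.content)
```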