Providers

PicoClaw supports many LLM providers across several protocol families through the model_list configuration.

For a smoother, more intuitive setup, we recommend using the Web UI as the primary way to configure models.

Web UI Model Setup

Supported Providers

| Provider | Purpose | Get API Key |
|----------|---------|-------------|
| OpenAI | GPT models | platform.openai.com |
| Anthropic | Claude models | console.anthropic.com |
| Anthropic Messages | Native Anthropic Messages API | console.anthropic.com |
| Venice AI | Venice AI models | venice.ai |
| Google Gemini | Gemini models | aistudio.google.com |
| Zhipu AI | GLM models (CN) | bigmodel.cn |
| Z.AI | Z.AI Coding Plan (GLM) | z.ai |
| DeepSeek | DeepSeek models | platform.deepseek.com |
| Groq | Fast inference + Whisper | console.groq.com |
| OpenRouter | Access to all models | openrouter.ai |
| Moonshot | Kimi models | platform.moonshot.cn |
| Qwen | Tongyi Qianwen | dashscope.console.aliyun.com |
| NVIDIA | NVIDIA AI models | build.nvidia.com |
| Mistral | Mistral models | console.mistral.ai |
| Avian | Avian models | avian.io |
| LongCat | LongCat models | longcat.chat |
| ModelScope | ModelScope models | modelscope.cn |
| Novita | Novita models | novita.ai |
| Vivgrid | Vivgrid hosted models | vivgrid.com |
| ShengSuanYun | ShengSuanYun models | router.shengsuanyun.com |
| Xiaomi MiMo | MiMo models | platform.xiaomimimo.com |
| Ollama | Local model server | Local (no key needed) |
| LM Studio | Local model server (OpenAI-compatible) | Local (default no key) |
| vLLM | Local model server (OpenAI-compatible) | Local |
| LiteLLM | LiteLLM proxy | Local proxy |
| Cerebras | Fast inference | cerebras.ai |
| VolcEngine | Doubao models | console.volcengine.com |
| Azure OpenAI | Azure-hosted OpenAI models | Azure Portal |
| AWS Bedrock | Bedrock-hosted models | AWS Console |
| Antigravity | Google Cloud Code Assist | OAuth only |
| Minimax | MiniMax models | platform.minimaxi.com |
| GitHub Copilot | Copilot bridge models | |
| Claude CLI / Codex CLI | Local CLI model bridges | Local CLI auth |

Quick Setup

```json
{
  "model_list": [
    {
      "model_name": "my-model",
      "model": "openai/gpt-5.4",
      "api_keys": ["sk-..."]
    }
  ],
  "agents": {
    "defaults": {
      "model_name": "my-model"
    }
  }
}
```

See Model Configuration for full details.

Z.AI Coding Plan Example

Z.AI and Zhipu AI are two brands of the same provider. For the Z.AI Coding Plan, use the openai model prefix with the Z.AI API base:

```json
{
  "model_name": "glm-4.7",
  "model": "openai/glm-4.7",
  "api_keys": ["your-z.ai-key"],
  "api_base": "https://api.z.ai/api/coding/paas/v4"
}
```

If the standard Zhipu endpoint returns 429 (insufficient balance), the Z.AI Coding Plan endpoint may have available balance since they use separate billing.

Voice Transcription

You can configure a dedicated model for audio transcription with voice.model_name. This lets you reuse existing multimodal providers that support audio input instead of relying only on Groq Whisper.

If voice.model_name is not configured, PicoClaw falls back to Groq transcription when a Groq API key is available.

```json
{
  "voice": {
    "model_name": "voice-gemini",
    "echo_transcription": false
  }
}
```
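The value of voice.model_name must match an entry in model_list. A minimal sketch of such an entry, assuming a Gemini model that accepts audio input (the model_name "voice-gemini" and the key are placeholders):

```json
{
  "model_list": [
    {
      "model_name": "voice-gemini",
      "model": "gemini/gemini-2.5-flash",
      "api_keys": ["your-gemini-key"]
    }
  ]
}
```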

Model Failover Cascade

PicoClaw supports automatic failover when you configure a primary model with fallback models. The runtime retries the next candidate for retriable failures such as HTTP 429, quota/rate-limit errors, and timeouts. It also applies cooldown tracking per candidate to avoid immediately retrying a recently failed target.

```json
{
  "model_list": [
    {
      "model_name": "qwen-main",
      "model": "openai/qwen3.5:cloud",
      "api_base": "https://api.example.com/v1",
      "api_keys": ["sk-main"]
    },
    {
      "model_name": "deepseek-backup",
      "model": "deepseek/deepseek-chat",
      "api_keys": ["sk-backup-1"]
    },
    {
      "model_name": "gemini-backup",
      "model": "gemini/gemini-2.5-flash",
      "api_keys": ["sk-backup-2"]
    }
  ],
  "agents": {
    "defaults": {
      "model": {
        "primary": "qwen-main",
        "fallbacks": ["deepseek-backup", "gemini-backup"]
      }
    }
  }
}
```

If you also use key-level failover for a single model (multiple keys in api_keys), PicoClaw chains through the additional key-backed candidates for that model before moving on to cross-model backups.
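Key-level failover needs no extra configuration beyond listing several keys on one entry. A minimal sketch with hypothetical keys, reusing the deepseek-backup entry from above:

```json
{
  "model_name": "deepseek-backup",
  "model": "deepseek/deepseek-chat",
  "api_keys": ["sk-backup-1", "sk-backup-2", "sk-backup-3"]
}
```

With this entry, each key is tried in order before the failover cascade advances to the next model.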

Special Providers

  • Antigravity — Google Cloud Code Assist, uses OAuth instead of API keys
  • Groq — also provides free voice transcription (Whisper) for Telegram voice messages