Nyzhi is provider-agnostic. It ships with 16 built-in providers and supports any OpenAI-compatible endpoint as a custom provider. Swap models by changing one line of config.
## Built-in Provider Table
| Provider | ID | API Style | Env Variable | OAuth | Type |
|---|---|---|---|---|---|
| Kimi (Moonshot) | kimi | openai | MOONSHOT_API_KEY | No | Open |
| Kimi Coding Plan | kimi-coding | anthropic | KIMI_CODING_API_KEY | No | Open |
| MiniMax | minimax | openai | MINIMAX_API_KEY | No | Open |
| MiniMax Coding Plan | minimax-coding | anthropic | MINIMAX_CODING_API_KEY | No | Open |
| GLM (Z.ai) | glm | openai | ZHIPU_API_KEY | No | Open |
| GLM Coding Plan | glm-coding | openai | ZHIPU_CODING_API_KEY | No | Open |
| DeepSeek | deepseek | openai | DEEPSEEK_API_KEY | No | Open |
| Groq | groq | openai | GROQ_API_KEY | No | Cloud |
| Together AI | together | openai | TOGETHER_API_KEY | No | Cloud |
| Ollama | ollama | openai | OLLAMA_API_KEY | No | Local |
| OpenAI | openai | openai | OPENAI_API_KEY | Yes | Proprietary |
| Anthropic | anthropic | anthropic | ANTHROPIC_API_KEY | Yes | Proprietary |
| Google Gemini | gemini | gemini | GEMINI_API_KEY | Yes | Proprietary |
| OpenRouter | openrouter | openai | OPENROUTER_API_KEY | No | Aggregator |
| Cursor | cursor | cursor | CURSOR_API_KEY | Yes | Proprietary |
| GitHub Copilot | github-copilot | copilot | GITHUB_COPILOT_TOKEN | Yes | Proprietary |
Plus two SDK-style providers:
| Provider | ID | Fallback |
|---|---|---|
| Claude Agent SDK | claude-sdk | Falls back to anthropic credential |
| OpenAI Codex CLI | codex | Falls back to openai credential |
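Because these reuse the parent provider's credential, selecting one needs no extra key once the parent is authenticated. A sketch, assuming the same `[provider]` config shape used throughout this page:

```toml
# Hypothetical example: claude-sdk falls back to the anthropic
# credential, so only the default needs to change.
[provider]
default = "claude-sdk"
```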
## Quick Setup Examples
### Kimi (Moonshot)

```bash
export MOONSHOT_API_KEY="sk-..."
```

```toml
[provider]
default = "kimi"

[provider.kimi]
model = "moonshot-v1-128k"
```
Kimi excels at long-context tasks (128k tokens). The Coding Plan variant (`kimi-coding`) uses the Anthropic-style API for structured tool use.
### MiniMax

```bash
export MINIMAX_API_KEY="..."
```

```toml
[provider]
default = "minimax"

[provider.minimax]
model = "abab7-chat"
```
MiniMax provides strong multilingual support. The Coding Plan variant is available as `minimax-coding`.
### GLM (Zhipu / Z.ai)

```bash
export ZHIPU_API_KEY="..."
```

```toml
[provider]
default = "glm"

[provider.glm]
model = "glm-4-plus"
```
GLM offers competitive code generation through its Z.ai platform. A Coding Plan variant is available as `glm-coding`.
### DeepSeek

```bash
export DEEPSEEK_API_KEY="sk-..."
```

```toml
[provider]
default = "deepseek"

[provider.deepseek]
model = "deepseek-chat"
```
DeepSeek provides strong coding performance at competitive pricing.
### Groq

```bash
export GROQ_API_KEY="gsk_..."
```

```toml
[provider]
default = "groq"

[provider.groq]
model = "llama-3.3-70b-versatile"
```
Groq runs open-source models on custom LPU hardware for extremely fast inference.
### Together AI

```bash
export TOGETHER_API_KEY="..."
```

```toml
[provider]
default = "together"

[provider.together]
model = "meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo"
```
Together AI provides access to a wide catalog of open-source models with serverless inference.
### Ollama (Local)

No API key needed — just have Ollama running locally:

```bash
# Install and start Ollama
ollama serve

# Pull a model
ollama pull codestral
```

```toml
[provider]
default = "ollama"

[provider.ollama]
base_url = "http://localhost:11434/v1"
model = "codestral"
```
**Tip:** Ollama runs completely offline. No data leaves your machine.
### OpenAI

```bash
export OPENAI_API_KEY="sk-..."
```

```toml
[provider]
default = "openai"

[provider.openai]
model = "gpt-5.3-codex"
```
Or use OAuth:

```bash
nyz login openai
```
### Anthropic

```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```

```toml
[provider]
default = "anthropic"

[provider.anthropic]
model = "claude-sonnet-4-20250514"
```
### Google Gemini

```bash
export GEMINI_API_KEY="..."
```

```toml
[provider]
default = "gemini"

[provider.gemini]
model = "gemini-2.5-pro"
```
### OpenRouter

Access any model through a single API:

```bash
export OPENROUTER_API_KEY="sk-or-..."
```

```toml
[provider]
default = "openrouter"

[provider.openrouter]
model = "anthropic/claude-sonnet-4-20250514"
```
## Custom / OpenAI-Compatible Providers

Any API that follows the OpenAI chat completions format works:

```toml
[provider.my-custom]
base_url = "https://my-company.example.com/v1"
api_key = "sk-..."
model = "internal-model-v3"
api_style = "openai"
```

Then set it as default:

```toml
[provider]
default = "my-custom"
```
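To illustrate what "OpenAI chat completions format" means here, a minimal sketch of the request body such an endpoint must accept at `POST {base_url}/chat/completions` (the model name below is the hypothetical one from the config above):

```python
import json

def chat_request_body(model: str, prompt: str) -> str:
    # Minimal OpenAI-style chat completions payload: a model name
    # plus a list of role/content messages.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(payload)

body = chat_request_body("internal-model-v3", "Hello")
```

Any client that can send this shape, with an `Authorization: Bearer <key>` header, can talk to the provider — which is why only `base_url`, `api_key`, and `model` need to be configured.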
## Provider Selection at Runtime

How Nyzhi picks which provider and model to use:

Provider:

1. `--provider` CLI flag (if set)
2. `config.provider.default`

Model:

1. `--model` CLI flag (if set)
2. `[provider.<id>].model` (if set)
3. First supported model for the provider (fallback)
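The precedence above can be sketched as a small resolver (hypothetical names and config shape; Nyzhi's internals may differ):

```python
def supported_models(provider):
    # Stand-in for Nyzhi's built-in per-provider model lists.
    return {"deepseek": ["deepseek-chat", "deepseek-coder"]}.get(
        provider, ["default-model"]
    )

def resolve(cli_provider, cli_model, config):
    """Pick (provider, model) using CLI-flag-first precedence.

    `config` mirrors the TOML:
    {"provider": {"default": ..., "<id>": {"model": ...}}}
    """
    provider = cli_provider or config["provider"]["default"]
    per_provider = config["provider"].get(provider, {})
    model = (
        cli_model                        # 1. --model flag
        or per_provider.get("model")     # 2. [provider.<id>].model
        or supported_models(provider)[0] # 3. first supported model
    )
    return provider, model
```

For example, with only `default = "deepseek"` configured and no flags, this resolves to `("deepseek", "deepseek-chat")`.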
## Model Routing

When routing is enabled, Nyzhi automatically picks different models based on task complexity:

```toml
[agent.routing]
enabled = true
low_keywords = ["fix typo", "rename"]
high_keywords = ["refactor", "architect", "redesign"]
```
Prompts are classified into tiers (low, medium, high), and the provider selects an appropriate model for each tier.
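As a sketch of how keyword-based tier classification could work (hypothetical logic, not Nyzhi's actual implementation):

```python
def classify_tier(prompt, low_keywords, high_keywords):
    """Map a prompt to a routing tier by keyword match.

    High-complexity keywords win over low ones; anything
    unmatched is treated as medium.
    """
    text = prompt.lower()
    if any(kw in text for kw in high_keywords):
        return "high"
    if any(kw in text for kw in low_keywords):
        return "low"
    return "medium"
```

With the keywords from the config above, "fix typo in README" would route to a low-tier model and "refactor the auth module" to a high-tier one.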
See Routing for full details.
## Using Multiple Providers

You can define several providers and switch between them:

```toml
[provider]
default = "anthropic"

[provider.anthropic]
model = "claude-sonnet-4-20250514"

[provider.openai]
model = "gpt-5.3-codex"

[provider.deepseek]
model = "deepseek-chat"

[provider.ollama]
base_url = "http://localhost:11434/v1"
model = "codestral"
```
Switch at runtime without changing config:

```bash
nyz --provider openai
nyz --provider deepseek --model deepseek-coder
```
## Next Steps
- Authentication — API keys, OAuth, multi-account rotation
- Configuration — full config reference
- Routing — automatic model selection by complexity