AI coding agent, built for the terminal.
```bash
# YOLO
curl -fsSL https://opencode.ai/install | bash

# Package managers
npm i -g opencode-ai@latest   # or bun/pnpm/yarn
brew install sst/tap/opencode # macOS
paru -S opencode-bin          # Arch Linux
```
Note: if you have a previous version (< 0.1.x) installed, remove it first.
The recommended approach is to sign up for Claude Pro or Max, run `opencode auth login`, and select Anthropic. It is the most cost-effective way to use this tool.

opencode is also powered by the provider list at models.dev, so you can use `opencode auth login` to configure API keys for any other provider you'd like to use. Credentials are stored in `~/.local/share/opencode/auth.json`.
```
$ opencode auth login

┌  Add credential
│
◆  Select provider
│  ● Anthropic (recommended)
│  ○ OpenAI
│  ○ Google
│  ○ Amazon Bedrock
│  ○ Azure
│  ○ DeepSeek
│  ○ Groq
│  ...
└
```
The models.dev dataset is also used to detect common environment variables like `OPENAI_API_KEY` and autoload the matching provider. If a provider you want to use is missing, you can submit a PR to the models.dev repo. If you're configuring it just for yourself, check out the Config section below.
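As a minimal sketch of the autoload path (the key value below is a placeholder, not a real key), exporting one of those variables before launching opencode is enough — no `opencode auth login` step is needed for that provider:

```shell
# Placeholder key for illustration; substitute your real OpenAI key.
export OPENAI_API_KEY="sk-example-key"
# opencode reads the variable at startup and autoloads the OpenAI provider:
# opencode
```

This is handy in CI or containers, where you'd inject the variable from a secret store rather than keeping it in `auth.json`.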
Project configuration is optional. You can place an `opencode.json` file in the root of your repo and it will be loaded automatically.

```json
{
  "$schema": "https://opencode.ai/config.json"
}
```
You can also configure MCP servers, both local and remote:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "localmcp": {
      "type": "local",
      "command": ["bun", "x", "my-mcp-command"],
      "environment": {
        "MY_ENV_VAR": "my_env_var_value"
      }
    },
    "remotemcp": {
      "type": "remote",
      "url": "https://my-mcp-server.com"
    }
  }
}
```
You can use opencode with any provider listed at models.dev. Use the provider's npm package name as the key in your config. Note that opencode uses v5 of the ai-sdk, and not all providers support it yet.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "@ai-sdk/openai-compatible": {
      "name": "ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "llama2"
        }
      }
    }
  }
}
```
To run opencode locally you need:

- Bun
- Go 1.24.x

Then run:

```bash
$ bun install
$ cd packages/opencode
$ bun run src/index.ts
```
OpenRouter is not yet in the models.dev database, but you can configure it manually:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "@openrouter/ai-sdk-provider": {
      "name": "OpenRouter",
      "options": {
        "apiKey": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
      },
      "models": {
        "anthropic/claude-3.5-sonnet": {
          "name": "Claude 3.5 Sonnet"
        }
      }
    }
  }
}
```