> [!NOTE]
> This is the original OpenCode repository, now continuing at [Charm](https://github.com/charmbracelet) with its original creator, [Kujtim Hoxha](https://github.com/kujtimiihoxha).
> Development is continuing under a new name as we prepare for a public relaunch.
> Follow [@charmcli](https://x.com/charmcli) or join our [Discord](https://charm.sh/chat) for updates.

# ⌬ OpenCode

<p align="center"><img src="https://github.com/user-attachments/assets/9ae61ef6-70e5-4876-bc45-5bcb4e52c714" width="800"></p>

> **⚠️ Early Development Notice:** This project is in early development and is not yet ready for production use. Features may change, break, or be incomplete. Use at your own risk.

A powerful terminal-based AI assistant for developers, providing intelligent coding assistance directly in your terminal.

## Overview

OpenCode is a Go-based CLI application that brings AI assistance to your terminal. It provides a TUI (Terminal User Interface) for interacting with various AI models to help with coding tasks, debugging, and more.

<p>For a quick video overview, check out
<a href="https://www.youtube.com/watch?v=P8luPmEa1QI"><img width="25" src="https://upload.wikimedia.org/wikipedia/commons/0/09/YouTube_full-color_icon_%282017%29.svg"> OpenCode + Gemini 2.5 Pro: BYE Claude Code! I'm SWITCHING To the FASTEST AI Coder!</a></p>

<a href="https://www.youtube.com/watch?v=P8luPmEa1QI"><img width="550" src="https://i3.ytimg.com/vi/P8luPmEa1QI/maxresdefault.jpg"></a>

## Features

- **Interactive TUI**: Built with [Bubble Tea](https://github.com/charmbracelet/bubbletea) for a smooth terminal experience
- **Multiple AI Providers**: Support for OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Groq, Azure OpenAI, and OpenRouter
- **Session Management**: Save and manage multiple conversation sessions
- **Tool Integration**: AI can execute commands, search files, and modify code
- **Vim-like Editor**: Integrated editor with text input capabilities
- **Persistent Storage**: SQLite database for storing conversations and sessions
- **LSP Integration**: Language Server Protocol support for code intelligence
- **File Change Tracking**: Track and visualize file changes during sessions
- **External Editor Support**: Open your preferred editor for composing messages
- **Named Arguments for Custom Commands**: Create powerful custom commands with multiple named placeholders

## Installation

### Using the Install Script

```bash
# Install the latest version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | bash

# Install a specific version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | VERSION=0.1.0 bash
```

### Using Homebrew (macOS and Linux)

```bash
brew install opencode-ai/tap/opencode
```

### Using AUR (Arch Linux)

```bash
# Using yay
yay -S opencode-ai-bin

# Using paru
paru -S opencode-ai-bin
```

### Using Go

```bash
go install github.com/opencode-ai/opencode@latest
```

## Configuration

OpenCode looks for configuration in the following locations:

- `$HOME/.opencode.json`
- `$XDG_CONFIG_HOME/opencode/.opencode.json`
- `./.opencode.json` (local directory)
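
As a quick starting point, a minimal configuration file might look like the following (the API key value is a placeholder; the full set of options is documented under [Configuration File Structure](#configuration-file-structure) below):

```json
{
  "providers": {
    "openai": {
      "apiKey": "your-api-key"
    }
  },
  "autoCompact": true
}
```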

### Auto Compact Feature

OpenCode includes an auto compact feature that automatically summarizes your conversation when it approaches the model's context window limit. When enabled (default setting), this feature:

- Monitors token usage during your conversation
- Automatically triggers summarization when usage reaches 95% of the model's context window
- Creates a new session with the summary, allowing you to continue your work without losing context
- Helps prevent "out of context" errors that can occur with long conversations

You can enable or disable this feature in your configuration file:

```json
{
  "autoCompact": true // default is true
}
```

### Environment Variables

You can configure OpenCode using environment variables:

| Environment Variable       | Purpose                                                                       |
| -------------------------- | ----------------------------------------------------------------------------- |
| `ANTHROPIC_API_KEY`        | For Claude models                                                             |
| `OPENAI_API_KEY`           | For OpenAI models                                                             |
| `GEMINI_API_KEY`           | For Google Gemini models                                                      |
| `GITHUB_TOKEN`             | For GitHub Copilot models (see [Using GitHub Copilot](#using-github-copilot)) |
| `VERTEXAI_PROJECT`         | For Google Cloud VertexAI (Gemini)                                            |
| `VERTEXAI_LOCATION`        | For Google Cloud VertexAI (Gemini)                                            |
| `GROQ_API_KEY`             | For Groq models                                                               |
| `AWS_ACCESS_KEY_ID`        | For AWS Bedrock (Claude)                                                      |
| `AWS_SECRET_ACCESS_KEY`    | For AWS Bedrock (Claude)                                                      |
| `AWS_REGION`               | For AWS Bedrock (Claude)                                                      |
| `AZURE_OPENAI_ENDPOINT`    | For Azure OpenAI models                                                       |
| `AZURE_OPENAI_API_KEY`     | For Azure OpenAI models (optional when using Entra ID)                        |
| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models                                                       |
| `LOCAL_ENDPOINT`           | For self-hosted models                                                        |
| `SHELL`                    | Default shell to use (if not specified in config)                             |
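
For example, to make a provider key available for a single shell session before launching OpenCode (the key value is a placeholder):

```bash
# Export the provider credential so OpenCode can read it from the environment
export ANTHROPIC_API_KEY="your-api-key"

# Start OpenCode as usual
opencode
```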

### Shell Configuration

OpenCode allows you to configure the shell used by the bash tool. By default, it uses the shell specified in the `SHELL` environment variable, or falls back to `/bin/bash` if not set.

You can override this in your configuration file:

```json
{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}
```

This is useful if you want to use a different shell than your default system shell, or if you need to pass specific arguments to the shell.

### Configuration File Structure

```json
{
  "data": {
    "directory": ".opencode"
  },
  "providers": {
    "openai": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "anthropic": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "copilot": {
      "disabled": false
    },
    "groq": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "openrouter": {
      "apiKey": "your-api-key",
      "disabled": false
    }
  },
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    },
    "task": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    },
    "title": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 80
    }
  },
  "shell": {
    "path": "/bin/bash",
    "args": ["-l"]
  },
  "mcpServers": {
    "example": {
      "type": "stdio",
      "command": "path/to/mcp-server",
      "env": [],
      "args": []
    }
  },
  "lsp": {
    "go": {
      "disabled": false,
      "command": "gopls"
    }
  },
  "debug": false,
  "debugLSP": false,
  "autoCompact": true
}
```

## Supported AI Models

OpenCode supports a variety of AI models from different providers:

### OpenAI

- GPT-4.1 family (gpt-4.1, gpt-4.1-mini, gpt-4.1-nano)
- GPT-4.5 Preview
- GPT-4o family (gpt-4o, gpt-4o-mini)
- O1 family (o1, o1-pro, o1-mini)
- O3 family (o3, o3-mini)
- O4 Mini

### Anthropic

- Claude 4 Sonnet
- Claude 4 Opus
- Claude 3.5 Sonnet
- Claude 3.5 Haiku
- Claude 3.7 Sonnet
- Claude 3 Haiku
- Claude 3 Opus

### GitHub Copilot

- GPT-3.5 Turbo
- GPT-4
- GPT-4o
- GPT-4o Mini
- GPT-4.1
- Claude 3.5 Sonnet
- Claude 3.7 Sonnet
- Claude 3.7 Sonnet Thinking
- Claude Sonnet 4
- O1
- O3 Mini
- O4 Mini
- Gemini 2.0 Flash
- Gemini 2.5 Pro

### Google

- Gemini 2.5
- Gemini 2.5 Flash
- Gemini 2.0 Flash
- Gemini 2.0 Flash Lite

### AWS Bedrock

- Claude 3.7 Sonnet

### Groq

- Llama 4 Maverick (17b-128e-instruct)
- Llama 4 Scout (17b-16e-instruct)
- QWEN QWQ-32b
- Deepseek R1 distill Llama 70b
- Llama 3.3 70b Versatile

### Azure OpenAI

- GPT-4.1 family (gpt-4.1, gpt-4.1-mini, gpt-4.1-nano)
- GPT-4.5 Preview
- GPT-4o family (gpt-4o, gpt-4o-mini)
- O1 family (o1, o1-mini)
- O3 family (o3, o3-mini)
- O4 Mini

### Google Cloud VertexAI

- Gemini 2.5
- Gemini 2.5 Flash
## Usage

```bash
# Start OpenCode
opencode

# Start with debug logging
opencode -d

# Start with a specific working directory
opencode -c /path/to/project
```

## Non-interactive Prompt Mode

You can run OpenCode in non-interactive mode by passing a prompt directly as a command-line argument. This is useful for scripting, automation, or when you want a quick answer without launching the full TUI.

```bash
# Run a single prompt and print the AI's response to the terminal
opencode -p "Explain the use of context in Go"

# Get response in JSON format
opencode -p "Explain the use of context in Go" -f json

# Run without showing the spinner (useful for scripts)
opencode -p "Explain the use of context in Go" -q
```

In this mode, OpenCode will process your prompt, print the result to standard output, and then exit. All permissions are auto-approved for the session.

By default, a spinner animation is displayed while the model is processing your query. You can disable this spinner with the `-q` or `--quiet` flag, which is particularly useful when running OpenCode from scripts or automated workflows.

### Output Formats

OpenCode supports the following output formats in non-interactive mode:

| Format | Description                     |
| ------ | ------------------------------- |
| `text` | Plain text output (default)     |
| `json` | Output wrapped in a JSON object |

The output format is implemented as a strongly-typed `OutputFormat` in the codebase, ensuring type safety and validation when processing outputs.
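
For scripting, the JSON format combines well with standard shell tools. A small sketch (assuming `jq` is installed; `jq .` only pretty-prints the object, so no particular field layout is assumed):

```bash
# Run a prompt quietly, request JSON output, and pretty-print it with jq
opencode -p "Explain the use of context in Go" -q -f json | jq .
```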

## Command-line Flags

| Flag              | Short | Description                                          |
| ----------------- | ----- | ---------------------------------------------------- |
| `--help`          | `-h`  | Display help information                             |
| `--debug`         | `-d`  | Enable debug mode                                    |
| `--cwd`           | `-c`  | Set current working directory                        |
| `--prompt`        | `-p`  | Run a single prompt in non-interactive mode          |
| `--output-format` | `-f`  | Output format for non-interactive mode (text, json)  |
| `--quiet`         | `-q`  | Hide spinner in non-interactive mode                 |

## Keyboard Shortcuts

### Global Shortcuts

| Shortcut | Action                                                  |
| -------- | ------------------------------------------------------- |
| `Ctrl+C` | Quit application                                        |
| `Ctrl+?` | Toggle help dialog                                      |
| `?`      | Toggle help dialog (when not in editing mode)           |
| `Ctrl+L` | View logs                                               |
| `Ctrl+A` | Switch session                                          |
| `Ctrl+K` | Command dialog                                          |
| `Ctrl+O` | Toggle model selection dialog                           |
| `Esc`    | Close current overlay/dialog or return to previous mode |

### Chat Page Shortcuts

| Shortcut | Action                                  |
| -------- | --------------------------------------- |
| `Ctrl+N` | Create new session                      |
| `Ctrl+X` | Cancel current operation/generation     |
| `i`      | Focus editor (when not in writing mode) |
| `Esc`    | Exit writing mode and focus messages    |

### Editor Shortcuts

| Shortcut            | Action                                    |
| ------------------- | ----------------------------------------- |
| `Ctrl+S`            | Send message (when editor is focused)     |
| `Enter` or `Ctrl+S` | Send message (when editor is not focused) |
| `Ctrl+E`            | Open external editor                      |
| `Esc`               | Blur editor and focus messages            |

### Session Dialog Shortcuts

| Shortcut   | Action           |
| ---------- | ---------------- |
| `↑` or `k` | Previous session |
| `↓` or `j` | Next session     |
| `Enter`    | Select session   |
| `Esc`      | Close dialog     |

### Model Dialog Shortcuts

| Shortcut   | Action            |
| ---------- | ----------------- |
| `↑` or `k` | Move up           |
| `↓` or `j` | Move down         |
| `←` or `h` | Previous provider |
| `→` or `l` | Next provider     |
| `Esc`      | Close dialog      |

### Permission Dialog Shortcuts

| Shortcut                | Action                       |
| ----------------------- | ---------------------------- |
| `←` or `left`           | Switch options left          |
| `→` or `right` or `tab` | Switch options right         |
| `Enter` or `space`      | Confirm selection            |
| `a`                     | Allow permission             |
| `A`                     | Allow permission for session |
| `d`                     | Deny permission              |

### Logs Page Shortcuts

| Shortcut           | Action              |
| ------------------ | ------------------- |
| `Backspace` or `q` | Return to chat page |

## AI Assistant Tools

OpenCode's AI assistant has access to various tools to help with coding tasks:

### File and Code Tools

| Tool          | Description                 | Parameters                                                                               |
| ------------- | --------------------------- | ----------------------------------------------------------------------------------------- |
| `glob`        | Find files by pattern       | `pattern` (required), `path` (optional)                                                    |
| `grep`        | Search file contents        | `pattern` (required), `path` (optional), `include` (optional), `literal_text` (optional)   |
| `ls`          | List directory contents     | `path` (optional), `ignore` (optional array of patterns)                                   |
| `view`        | View file contents          | `file_path` (required), `offset` (optional), `limit` (optional)                            |
| `write`       | Write to files              | `file_path` (required), `content` (required)                                               |
| `edit`        | Edit files                  | Various parameters for file editing                                                        |
| `patch`       | Apply patches to files      | `file_path` (required), `diff` (required)                                                  |
| `diagnostics` | Get diagnostics information | `file_path` (optional)                                                                     |

### Other Tools

| Tool          | Description                            | Parameters                                                                                |
| ------------- | -------------------------------------- | ------------------------------------------------------------------------------------------ |
| `bash`        | Execute shell commands                 | `command` (required), `timeout` (optional)                                                  |
| `fetch`       | Fetch data from URLs                   | `url` (required), `format` (required), `timeout` (optional)                                 |
| `sourcegraph` | Search code across public repositories | `query` (required), `count` (optional), `context_window` (optional), `timeout` (optional)   |
| `agent`       | Run sub-tasks with the AI agent        | `prompt` (required)                                                                         |

## Architecture

OpenCode is built with a modular architecture:

- **cmd**: Command-line interface using Cobra
- **internal/app**: Core application services
- **internal/config**: Configuration management
- **internal/db**: Database operations and migrations
- **internal/llm**: LLM providers and tools integration
- **internal/tui**: Terminal UI components and layouts
- **internal/logging**: Logging infrastructure
- **internal/message**: Message handling
- **internal/session**: Session management
- **internal/lsp**: Language Server Protocol integration

## Custom Commands

OpenCode supports custom commands that can be created by users to quickly send predefined prompts to the AI assistant.

### Creating Custom Commands

Custom commands are predefined prompts stored as Markdown files in one of three locations:

1. **User Commands** (prefixed with `user:`):

   ```
   $XDG_CONFIG_HOME/opencode/commands/
   ```

   (typically `~/.config/opencode/commands/` on Linux/macOS)

   or

   ```
   $HOME/.opencode/commands/
   ```

2. **Project Commands** (prefixed with `project:`):

   ```
   <PROJECT DIR>/.opencode/commands/
   ```

Each `.md` file in these directories becomes a custom command. The file name (without extension) becomes the command ID.

For example, create a file at `~/.config/opencode/commands/prime-context.md` with the following content:

```markdown
RUN git ls-files
READ README.md
```

This creates a command called `user:prime-context`.
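
One way to create this command from the shell, using the same path and content as the example above:

```bash
# Create the user command directory if it does not exist yet
mkdir -p ~/.config/opencode/commands

# Write the prime-context command file
cat > ~/.config/opencode/commands/prime-context.md <<'EOF'
RUN git ls-files
READ README.md
EOF
```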

### Command Arguments

OpenCode supports named arguments in custom commands using placeholders in the format `$NAME` (where NAME consists of uppercase letters, numbers, and underscores, and must start with a letter).

For example:

```markdown
# Fetch Context for Issue $ISSUE_NUMBER

RUN gh issue view $ISSUE_NUMBER --json title,body,comments
RUN git log --author="$AUTHOR_NAME" --oneline
RUN grep -R "$SEARCH_PATTERN" $DIRECTORY
```

When you run a command with arguments, OpenCode will prompt you to enter values for each unique placeholder. Named arguments provide several benefits:

- Clear identification of what each argument represents
- Ability to use the same argument multiple times
- Better organization for commands with multiple inputs

### Organizing Commands

You can organize commands in subdirectories:

```
~/.config/opencode/commands/git/commit.md
```

This creates a command with ID `user:git:commit`.

### Using Custom Commands

1. Press `Ctrl+K` to open the command dialog
2. Select your custom command (prefixed with either `user:` or `project:`)
3. Press Enter to execute the command

The content of the command file will be sent as a message to the AI assistant.

### Built-in Commands

OpenCode includes several built-in commands:

| Command            | Description                                                                                          |
| ------------------ | ---------------------------------------------------------------------------------------------------- |
| Initialize Project | Creates or updates the OpenCode.md memory file with project-specific information                      |
| Compact Session    | Manually triggers the summarization of the current session, creating a new session with the summary   |

## MCP (Model Context Protocol)

OpenCode implements the Model Context Protocol (MCP) to extend its capabilities through external tools. MCP provides a standardized way for the AI assistant to interact with external services and tools.

### MCP Features

- **External Tool Integration**: Connect to external tools and services via a standardized protocol
- **Tool Discovery**: Automatically discover available tools from MCP servers
- **Multiple Connection Types**:
  - **Stdio**: Communicate with tools via standard input/output
  - **SSE**: Communicate with tools via Server-Sent Events
- **Security**: Permission system for controlling access to MCP tools

### Configuring MCP Servers

MCP servers are defined in the configuration file under the `mcpServers` section:

```json
{
  "mcpServers": {
    "example": {
      "type": "stdio",
      "command": "path/to/mcp-server",
      "env": [],
      "args": []
    },
    "web-example": {
      "type": "sse",
      "url": "https://example.com/mcp",
      "headers": {
        "Authorization": "Bearer token"
      }
    }
  }
}
```

### MCP Tool Usage

Once configured, MCP tools are automatically available to the AI assistant alongside built-in tools. They follow the same permission model as other tools, requiring user approval before execution.

## LSP (Language Server Protocol)

OpenCode integrates with the Language Server Protocol to provide code intelligence features across multiple programming languages.

### LSP Features

- **Multi-language Support**: Connect to language servers for different programming languages
- **Diagnostics**: Receive error checking and linting information
- **File Watching**: Automatically notify language servers of file changes

### Configuring LSP

Language servers are configured in the configuration file under the `lsp` section:

```json
{
  "lsp": {
    "go": {
      "disabled": false,
      "command": "gopls"
    },
    "typescript": {
      "disabled": false,
      "command": "typescript-language-server",
      "args": ["--stdio"]
    }
  }
}
```

### LSP Integration with AI

The AI assistant can access LSP features through the `diagnostics` tool, allowing it to:

- Check for errors in your code
- Suggest fixes based on diagnostics

While the LSP client implementation supports the full LSP protocol (including completions, hover, definition, etc.), currently only diagnostics are exposed to the AI assistant.

## Using GitHub Copilot

_Copilot support is currently experimental._

### Requirements

- [Copilot chat in the IDE](https://github.com/settings/copilot) enabled in GitHub settings
- One of:
  - VSCode GitHub Copilot chat extension
  - GitHub `gh` CLI
  - Neovim GitHub Copilot plugin (`copilot.vim` or `copilot.lua`)
  - GitHub token with Copilot permissions

If you use one of the plugins or CLI tools above, make sure you have authenticated it with your GitHub account. This should create a GitHub token at one of the following locations:

- `~/.config/github-copilot/[hosts,apps].json`
- `$XDG_CONFIG_HOME/github-copilot/[hosts,apps].json`

If you use an explicit GitHub token, you can either set the `$GITHUB_TOKEN` environment variable or add it to the opencode.json config file at `providers.copilot.apiKey`.
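
For the config-file route, the token goes under the `copilot` provider, matching the structure shown in [Configuration File Structure](#configuration-file-structure) (the value below is a placeholder):

```json
{
  "providers": {
    "copilot": {
      "apiKey": "your-github-token"
    }
  }
}
```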

## Using a self-hosted model provider

OpenCode can also load and use models from a self-hosted (OpenAI-like) provider.
This is useful for developers who want to experiment with custom models.

### Configuring a self-hosted provider

You can use a self-hosted model by setting the `LOCAL_ENDPOINT` environment variable.
This will cause OpenCode to load and use the models from the specified endpoint.

```bash
export LOCAL_ENDPOINT=http://localhost:1235/v1
```

### Configuring a self-hosted model

You can also configure a self-hosted model in the configuration file under the `agents` section:

```json
{
  "agents": {
    "coder": {
      "model": "local.granite-3.3-2b-instruct@q8_0",
      "reasoningEffort": "high"
    }
  }
}
```

## Development

### Prerequisites

- Go 1.24.0 or higher

### Building from Source

```bash
# Clone the repository
git clone https://github.com/opencode-ai/opencode.git
cd opencode

# Build
go build -o opencode

# Run
./opencode
```

## Acknowledgments

OpenCode gratefully acknowledges the contributions and support from these key individuals:

- [@isaacphi](https://github.com/isaacphi) - For the [mcp-language-server](https://github.com/isaacphi/mcp-language-server) project which provided the foundation for our LSP client implementation
- [@adamdottv](https://github.com/adamdottv) - For the design direction and UI/UX architecture

Special thanks to the broader open source community whose tools and libraries have made this project possible.

## License

OpenCode is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

## Contributing

Contributions are welcome! Here's how you can contribute:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

Please make sure to update tests as appropriate and follow the existing code style.