# ⌬ OpenCode

<p align="center"><img src="https://github.com/user-attachments/assets/9ae61ef6-70e5-4876-bc45-5bcb4e52c714" width="800"></p>

> **⚠️ Early Development Notice:** This project is in early development and is not yet ready for production use. Features may change, break, or be incomplete. Use at your own risk.

A powerful terminal-based AI assistant for developers, providing intelligent coding assistance directly in your terminal.

## Overview

OpenCode is a Go-based CLI application that brings AI assistance to your terminal. It provides a TUI (Terminal User Interface) for interacting with various AI models to help with coding tasks, debugging, and more.

<p>For a quick video overview, check out
<a href="https://www.youtube.com/watch?v=P8luPmEa1QI"><img width="25" src="https://upload.wikimedia.org/wikipedia/commons/0/09/YouTube_full-color_icon_%282017%29.svg"> OpenCode + Gemini 2.5 Pro: BYE Claude Code! I'm SWITCHING To the FASTEST AI Coder!</a></p>

<a href="https://www.youtube.com/watch?v=P8luPmEa1QI"><img width="550" src="https://i3.ytimg.com/vi/P8luPmEa1QI/maxresdefault.jpg"></a>

## Features

- **Interactive TUI**: Built with [Bubble Tea](https://github.com/charmbracelet/bubbletea) for a smooth terminal experience
- **Multiple AI Providers**: Support for OpenAI, Anthropic Claude, Google Gemini, AWS Bedrock, Groq, Azure OpenAI, and OpenRouter
- **Session Management**: Save and manage multiple conversation sessions
- **Tool Integration**: AI can execute commands, search files, and modify code
- **Vim-like Editor**: Integrated editor with text input capabilities
- **Persistent Storage**: SQLite database for storing conversations and sessions
- **LSP Integration**: Language Server Protocol support for code intelligence
- **File Change Tracking**: Track and visualize file changes during sessions
- **External Editor Support**: Open your preferred editor for composing messages
- **Named Arguments for Custom Commands**: Create powerful custom commands with multiple named placeholders

## Installation

### Using the Install Script

```bash
# Install the latest version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | bash

# Install a specific version
curl -fsSL https://raw.githubusercontent.com/opencode-ai/opencode/refs/heads/main/install | VERSION=0.1.0 bash
```

### Using Homebrew (macOS and Linux)

```bash
brew install opencode-ai/tap/opencode
```

### Using AUR (Arch Linux)

```bash
# Using yay
yay -S opencode-ai-bin

# Using paru
paru -S opencode-ai-bin
```

### Using Go

```bash
go install github.com/opencode-ai/opencode@latest
```

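If the `opencode` binary is not found after `go install`, make sure Go's install directory is on your `PATH`. A minimal sketch using standard Go tooling (adjust for your shell):

```bash
# go install places binaries in $(go env GOPATH)/bin by default;
# add that directory to PATH if your shell does not already include it
export PATH="$PATH:$(go env GOPATH)/bin"
opencode
```
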
## Configuration

OpenCode looks for configuration in the following locations:

- `$HOME/.opencode.json`
- `$XDG_CONFIG_HOME/opencode/.opencode.json`
- `./.opencode.json` (local directory)

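A configuration file only needs the pieces you actually use. As a minimal sketch, a local `./.opencode.json` that sets an Anthropic key and a coder model might look like this (the key value is a placeholder; see the full structure below):

```json
{
  "providers": {
    "anthropic": {
      "apiKey": "your-api-key"
    }
  },
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet"
    }
  }
}
```
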
### Auto Compact Feature

OpenCode includes an auto compact feature that automatically summarizes your conversation when it approaches the model's context window limit. When enabled (default setting), this feature:

- Monitors token usage during your conversation
- Automatically triggers summarization when usage reaches 95% of the model's context window
- Creates a new session with the summary, allowing you to continue your work without losing context
- Helps prevent "out of context" errors that can occur with long conversations

You can enable or disable this feature in your configuration file:

```json
{
  "autoCompact": true // default is true
}
```

### Environment Variables

You can configure OpenCode using environment variables:

| Environment Variable       | Purpose                                                |
| -------------------------- | ------------------------------------------------------ |
| `ANTHROPIC_API_KEY`        | For Claude models                                      |
| `OPENAI_API_KEY`           | For OpenAI models                                      |
| `GEMINI_API_KEY`           | For Google Gemini models                               |
| `VERTEXAI_PROJECT`         | For Google Cloud VertexAI (Gemini)                     |
| `VERTEXAI_LOCATION`        | For Google Cloud VertexAI (Gemini)                     |
| `GROQ_API_KEY`             | For Groq models                                        |
| `AWS_ACCESS_KEY_ID`        | For AWS Bedrock (Claude)                               |
| `AWS_SECRET_ACCESS_KEY`    | For AWS Bedrock (Claude)                               |
| `AWS_REGION`               | For AWS Bedrock (Claude)                               |
| `AZURE_OPENAI_ENDPOINT`    | For Azure OpenAI models                                |
| `AZURE_OPENAI_API_KEY`     | For Azure OpenAI models (optional when using Entra ID) |
| `AZURE_OPENAI_API_VERSION` | For Azure OpenAI models                                |
| `LOCAL_ENDPOINT`           | For self-hosted models                                 |
| `SHELL`                    | Default shell to use (if not specified in config)      |

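For example, to run OpenCode against Anthropic using only environment variables (the key value below is a placeholder):

```bash
# supply a provider key via the environment, then start OpenCode
export ANTHROPIC_API_KEY="your-api-key"
opencode
```
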
### Shell Configuration

OpenCode allows you to configure the shell used by the bash tool. By default, it uses the shell specified in the `SHELL` environment variable, or falls back to `/bin/bash` if not set.

You can override this in your configuration file:

```json
{
  "shell": {
    "path": "/bin/zsh",
    "args": ["-l"]
  }
}
```

This is useful if you want to use a different shell than your default system shell, or if you need to pass specific arguments to the shell.

### Configuration File Structure

```json
{
  "data": {
    "directory": ".opencode"
  },
  "providers": {
    "openai": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "anthropic": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "groq": {
      "apiKey": "your-api-key",
      "disabled": false
    },
    "openrouter": {
      "apiKey": "your-api-key",
      "disabled": false
    }
  },
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    },
    "task": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    },
    "title": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 80
    }
  },
  "shell": {
    "path": "/bin/bash",
    "args": ["-l"]
  },
  "mcpServers": {
    "example": {
      "type": "stdio",
      "command": "path/to/mcp-server",
      "env": [],
      "args": []
    }
  },
  "lsp": {
    "go": {
      "disabled": false,
      "command": "gopls"
    }
  },
  "debug": false,
  "debugLSP": false,
  "autoCompact": true
}
```

## Supported AI Models

OpenCode supports a variety of AI models from different providers:

### OpenAI

- GPT-4.1 family (gpt-4.1, gpt-4.1-mini, gpt-4.1-nano)
- GPT-4.5 Preview
- GPT-4o family (gpt-4o, gpt-4o-mini)
- O1 family (o1, o1-pro, o1-mini)
- O3 family (o3, o3-mini)
- O4 Mini

### Anthropic

- Claude 3.5 Sonnet
- Claude 3.5 Haiku
- Claude 3.7 Sonnet
- Claude 3 Haiku
- Claude 3 Opus

### Google

- Gemini 2.5
- Gemini 2.5 Flash
- Gemini 2.0 Flash
- Gemini 2.0 Flash Lite

### AWS Bedrock

- Claude 3.7 Sonnet

### Groq

- Llama 4 Maverick (17b-128e-instruct)
- Llama 4 Scout (17b-16e-instruct)
- QWEN QWQ-32b
- Deepseek R1 distill Llama 70b
- Llama 3.3 70b Versatile

### Azure OpenAI

- GPT-4.1 family (gpt-4.1, gpt-4.1-mini, gpt-4.1-nano)
- GPT-4.5 Preview
- GPT-4o family (gpt-4o, gpt-4o-mini)
- O1 family (o1, o1-mini)
- O3 family (o3, o3-mini)
- O4 Mini

### Google Cloud VertexAI

- Gemini 2.5
- Gemini 2.5 Flash

## Usage

```bash
# Start OpenCode
opencode

# Start with debug logging
opencode -d

# Start with a specific working directory
opencode -c /path/to/project
```

## Non-interactive Prompt Mode

You can run OpenCode in non-interactive mode by passing a prompt directly as a command-line argument. This is useful for scripting, automation, or when you want a quick answer without launching the full TUI.

```bash
# Run a single prompt and print the AI's response to the terminal
opencode -p "Explain the use of context in Go"

# Get response in JSON format
opencode -p "Explain the use of context in Go" -f json

# Run without showing the spinner (useful for scripts)
opencode -p "Explain the use of context in Go" -q
```

In this mode, OpenCode will process your prompt, print the result to standard output, and then exit. All permissions are auto-approved for the session.

By default, a spinner animation is displayed while the model is processing your query. You can disable this spinner with the `-q` or `--quiet` flag, which is particularly useful when running OpenCode from scripts or automated workflows.

### Output Formats

OpenCode supports the following output formats in non-interactive mode:

| Format | Description                     |
| ------ | ------------------------------- |
| `text` | Plain text output (default)     |
| `json` | Output wrapped in a JSON object |

The output format is implemented as a strongly-typed `OutputFormat` in the codebase, ensuring type safety and validation when processing outputs.

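In a script you might combine these flags and post-process the JSON output. A sketch, assuming `jq` is installed (the exact shape of the JSON wrapper is not specified here, so it is simply pretty-printed):

```bash
# run a prompt quietly, capture the JSON-wrapped response, and pretty-print it
response=$(opencode -p "List the exported functions in this package" -f json -q)
echo "$response" | jq .
```
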
## Command-line Flags

| Flag              | Short | Description                                         |
| ----------------- | ----- | --------------------------------------------------- |
| `--help`          | `-h`  | Display help information                            |
| `--debug`         | `-d`  | Enable debug mode                                   |
| `--cwd`           | `-c`  | Set current working directory                       |
| `--prompt`        | `-p`  | Run a single prompt in non-interactive mode         |
| `--output-format` | `-f`  | Output format for non-interactive mode (text, json) |
| `--quiet`         | `-q`  | Hide spinner in non-interactive mode                |

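These flags can be combined. For example (the project path and prompt are illustrative):

```bash
# run a one-off prompt against a specific project directory, with JSON output and no spinner
opencode -c /path/to/project -p "Summarize the architecture of this project" -f json -q
```
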
## Keyboard Shortcuts

### Global Shortcuts

| Shortcut | Action                                                  |
| -------- | ------------------------------------------------------- |
| `Ctrl+C` | Quit application                                        |
| `Ctrl+?` | Toggle help dialog                                      |
| `?`      | Toggle help dialog (when not in editing mode)           |
| `Ctrl+L` | View logs                                               |
| `Ctrl+A` | Switch session                                          |
| `Ctrl+K` | Command dialog                                          |
| `Ctrl+O` | Toggle model selection dialog                           |
| `Esc`    | Close current overlay/dialog or return to previous mode |

### Chat Page Shortcuts

| Shortcut | Action                                  |
| -------- | --------------------------------------- |
| `Ctrl+N` | Create new session                      |
| `Ctrl+X` | Cancel current operation/generation     |
| `i`      | Focus editor (when not in writing mode) |
| `Esc`    | Exit writing mode and focus messages    |

### Editor Shortcuts

| Shortcut            | Action                                    |
| ------------------- | ----------------------------------------- |
| `Ctrl+S`            | Send message (when editor is focused)     |
| `Enter` or `Ctrl+S` | Send message (when editor is not focused) |
| `Ctrl+E`            | Open external editor                      |
| `Esc`               | Blur editor and focus messages            |

### Session Dialog Shortcuts

| Shortcut   | Action           |
| ---------- | ---------------- |
| `↑` or `k` | Previous session |
| `↓` or `j` | Next session     |
| `Enter`    | Select session   |
| `Esc`      | Close dialog     |

### Model Dialog Shortcuts

| Shortcut   | Action            |
| ---------- | ----------------- |
| `↑` or `k` | Move up           |
| `↓` or `j` | Move down         |
| `←` or `h` | Previous provider |
| `→` or `l` | Next provider     |
| `Esc`      | Close dialog      |

### Permission Dialog Shortcuts

| Shortcut                | Action                       |
| ----------------------- | ---------------------------- |
| `←` or `left`           | Switch options left          |
| `→` or `right` or `tab` | Switch options right         |
| `Enter` or `space`      | Confirm selection            |
| `a`                     | Allow permission             |
| `A`                     | Allow permission for session |
| `d`                     | Deny permission              |

### Logs Page Shortcuts

| Shortcut           | Action              |
| ------------------ | ------------------- |
| `Backspace` or `q` | Return to chat page |

## AI Assistant Tools

OpenCode's AI assistant has access to various tools to help with coding tasks:

### File and Code Tools

| Tool          | Description                 | Parameters                                                                               |
| ------------- | --------------------------- | ---------------------------------------------------------------------------------------- |
| `glob`        | Find files by pattern       | `pattern` (required), `path` (optional)                                                  |
| `grep`        | Search file contents        | `pattern` (required), `path` (optional), `include` (optional), `literal_text` (optional) |
| `ls`          | List directory contents     | `path` (optional), `ignore` (optional array of patterns)                                 |
| `view`        | View file contents          | `file_path` (required), `offset` (optional), `limit` (optional)                          |
| `write`       | Write to files              | `file_path` (required), `content` (required)                                             |
| `edit`        | Edit files                  | Various parameters for file editing                                                      |
| `patch`       | Apply patches to files      | `file_path` (required), `diff` (required)                                                |
| `diagnostics` | Get diagnostics information | `file_path` (optional)                                                                   |

### Other Tools

| Tool          | Description                            | Parameters                                                                                |
| ------------- | -------------------------------------- | ----------------------------------------------------------------------------------------- |
| `bash`        | Execute shell commands                 | `command` (required), `timeout` (optional)                                                |
| `fetch`       | Fetch data from URLs                   | `url` (required), `format` (required), `timeout` (optional)                               |
| `sourcegraph` | Search code across public repositories | `query` (required), `count` (optional), `context_window` (optional), `timeout` (optional) |
| `agent`       | Run sub-tasks with the AI agent        | `prompt` (required)                                                                       |

## Architecture

OpenCode is built with a modular architecture:

- **cmd**: Command-line interface using Cobra
- **internal/app**: Core application services
- **internal/config**: Configuration management
- **internal/db**: Database operations and migrations
- **internal/llm**: LLM providers and tools integration
- **internal/tui**: Terminal UI components and layouts
- **internal/logging**: Logging infrastructure
- **internal/message**: Message handling
- **internal/session**: Session management
- **internal/lsp**: Language Server Protocol integration

## Custom Commands

OpenCode supports custom commands, which let you quickly send predefined prompts to the AI assistant.

### Creating Custom Commands

Custom commands are predefined prompts stored as Markdown files in one of three locations:

1. **User Commands** (prefixed with `user:`):

   ```
   $XDG_CONFIG_HOME/opencode/commands/
   ```

   (typically `~/.config/opencode/commands/` on Linux/macOS)

   or

   ```
   $HOME/.opencode/commands/
   ```

2. **Project Commands** (prefixed with `project:`):

   ```
   <PROJECT DIR>/.opencode/commands/
   ```

Each `.md` file in these directories becomes a custom command. The file name (without extension) becomes the command ID.

For example, create a file at `~/.config/opencode/commands/prime-context.md` with the following content:

```markdown
RUN git ls-files
READ README.md
```

This creates a command called `user:prime-context`.

### Command Arguments

OpenCode supports named arguments in custom commands using placeholders in the format `$NAME` (where NAME consists of uppercase letters, numbers, and underscores, and must start with a letter).

For example:

```markdown
# Fetch Context for Issue $ISSUE_NUMBER

RUN gh issue view $ISSUE_NUMBER --json title,body,comments
RUN git grep --author="$AUTHOR_NAME" -n .
RUN grep -R "$SEARCH_PATTERN" $DIRECTORY
```

When you run a command with arguments, OpenCode will prompt you to enter values for each unique placeholder. Named arguments provide several benefits:

- Clear identification of what each argument represents
- Ability to use the same argument multiple times
- Better organization for commands with multiple inputs

### Organizing Commands

You can organize commands in subdirectories:

```
~/.config/opencode/commands/git/commit.md
```

This creates a command with ID `user:git:commit`.

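A file at that path might look like the following hypothetical command body, using the named-argument placeholders described above:

```markdown
Create a git commit for the current changes.

RUN git status
RUN git diff --staged
RUN git commit -m "$COMMIT_MESSAGE"
```
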
### Using Custom Commands

1. Press `Ctrl+K` to open the command dialog
2. Select your custom command (prefixed with either `user:` or `project:`)
3. Press Enter to execute the command

The content of the command file will be sent as a message to the AI assistant.

### Built-in Commands

OpenCode includes several built-in commands:

| Command            | Description                                                                                         |
| ------------------ | --------------------------------------------------------------------------------------------------- |
| Initialize Project | Creates or updates the OpenCode.md memory file with project-specific information                    |
| Compact Session    | Manually triggers the summarization of the current session, creating a new session with the summary |

## MCP (Model Context Protocol)

OpenCode implements the Model Context Protocol (MCP) to extend its capabilities through external tools. MCP provides a standardized way for the AI assistant to interact with external services and tools.

### MCP Features

- **External Tool Integration**: Connect to external tools and services via a standardized protocol
- **Tool Discovery**: Automatically discover available tools from MCP servers
- **Multiple Connection Types**:
  - **Stdio**: Communicate with tools via standard input/output
  - **SSE**: Communicate with tools via Server-Sent Events
- **Security**: Permission system for controlling access to MCP tools

### Configuring MCP Servers

MCP servers are defined in the configuration file under the `mcpServers` section:

```json
{
  "mcpServers": {
    "example": {
      "type": "stdio",
      "command": "path/to/mcp-server",
      "env": [],
      "args": []
    },
    "web-example": {
      "type": "sse",
      "url": "https://example.com/mcp",
      "headers": {
        "Authorization": "Bearer token"
      }
    }
  }
}
```

### MCP Tool Usage

Once configured, MCP tools are automatically available to the AI assistant alongside built-in tools. They follow the same permission model as other tools, requiring user approval before execution.

## LSP (Language Server Protocol)

OpenCode integrates with the Language Server Protocol to provide code intelligence features across multiple programming languages.

### LSP Features

- **Multi-language Support**: Connect to language servers for different programming languages
- **Diagnostics**: Receive error checking and linting information
- **File Watching**: Automatically notify language servers of file changes

### Configuring LSP

Language servers are configured in the configuration file under the `lsp` section:

```json
{
  "lsp": {
    "go": {
      "disabled": false,
      "command": "gopls"
    },
    "typescript": {
      "disabled": false,
      "command": "typescript-language-server",
      "args": ["--stdio"]
    }
  }
}
```

### LSP Integration with AI

The AI assistant can access LSP features through the `diagnostics` tool, allowing it to:

- Check for errors in your code
- Suggest fixes based on diagnostics

While the LSP client implementation supports the full LSP protocol (including completions, hover, definition, etc.), currently only diagnostics are exposed to the AI assistant.

## Using a self-hosted model provider

OpenCode can also load and use models from a self-hosted (OpenAI-compatible) provider. This is useful for developers who want to experiment with custom models.

### Configuring a self-hosted provider

You can use a self-hosted model by setting the `LOCAL_ENDPOINT` environment variable. This causes OpenCode to load and use the models available at the specified endpoint.

```bash
export LOCAL_ENDPOINT=http://localhost:1235/v1
```

### Configuring a self-hosted model

You can also configure a self-hosted model in the configuration file under the `agents` section:

```json
{
  "agents": {
    "coder": {
      "model": "local.granite-3.3-2b-instruct@q8_0",
      "reasoningEffort": "high"
    }
  }
}
```

## Development

### Prerequisites

- Go 1.24.0 or higher

### Building from Source

```bash
# Clone the repository
git clone https://github.com/opencode-ai/opencode.git
cd opencode

# Build
go build -o opencode

# Run
./opencode
```

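To run the test suite before submitting changes, standard Go tooling should suffice (no project-specific flags are assumed here):

```bash
# run all tests in the repository
go test ./...
```
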
## Acknowledgments

OpenCode gratefully acknowledges the contributions and support from these key individuals:

- [@isaacphi](https://github.com/isaacphi) - For the [mcp-language-server](https://github.com/isaacphi/mcp-language-server) project which provided the foundation for our LSP client implementation
- [@adamdottv](https://github.com/adamdottv) - For the design direction and UI/UX architecture

Special thanks to the broader open source community whose tools and libraries have made this project possible.

## License

OpenCode is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

## Contributing

Contributions are welcome! Here's how you can contribute:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

Please make sure to update tests as appropriate and follow the existing code style.