# Configuring the Assistant

## Providers {#providers}

The following providers are supported:

- [Zed AI (configured by default when signed in)](#zed-ai)
- [Anthropic](#anthropic)
- [GitHub Copilot Chat](#github-copilot-chat) [^1]
- [Google Gemini](#google-gemini) [^1]
- [Ollama](#ollama)
- [OpenAI](#openai)

To configure a provider, run `assistant: show configuration` in the command palette, or click the hamburger menu at the top-right of the assistant panel and select "Configure".

[^1]: This provider does not support the [`/workflow`](./commands#workflow-not-generally-available) command.

To further customize providers, you can use your `settings.json` as follows:

- [Configuring endpoints](#custom-endpoint)
- [Configuring timeouts](#provider-timeout)
- [Configuring the default model](#default-model)
### Zed AI {#zed-ai}

A hosted service providing convenient and performant support for AI-enabled coding in Zed, powered by Anthropic's Claude 3.5 Sonnet and accessible just by signing in.

### Anthropic {#anthropic}

You can use Claude 3.5 Sonnet via [Zed AI](#zed-ai) for free. To use other Anthropic models, you will need to configure the provider with your own API key:

1. Obtain an API key [here](https://console.anthropic.com/settings/keys).
2. Make sure that your Anthropic account has credits.
3. Open the configuration view (`assistant: show configuration`) and navigate to the Anthropic section.
4. Enter your Anthropic API key.

Even if you pay for Claude Pro, you will still have to [pay for additional credits](https://console.anthropic.com/settings/plans) to use the models via the API.
#### Anthropic Custom Models {#anthropic-custom-models}

You can add custom models to the Anthropic provider by adding the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "anthropic": {
      "available_models": [
        {
          "name": "some-model",
          "display_name": "some-model",
          "max_tokens": 128000,
          "max_output_tokens": 2560,
          "cache_configuration": {
            "max_cache_anchors": 10,
            "min_total_token": 10000,
            "should_speculate": false
          },
          "tool_override": "some-model-that-supports-toolcalling"
        }
      ]
    }
  }
}
```

Custom models will be listed in the model dropdown in the assistant panel.

### GitHub Copilot Chat {#github-copilot-chat}

You can use GitHub Copilot Chat with the Zed assistant by choosing it via the model dropdown in the assistant panel.

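As noted in the [custom timeout](#provider-timeout) section below, Copilot Chat supports the `low_speed_timeout_in_seconds` setting. For example, to allow slower responses before timing out:

```json
{
  "language_models": {
    "copilot_chat": {
      "low_speed_timeout_in_seconds": 30
    }
  }
}
```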
### Google Gemini {#google-gemini}

You can use Gemini 1.5 Pro/Flash with the Zed assistant by choosing it via the model dropdown in the assistant panel.

You can obtain an API key [here](https://aistudio.google.com/app/apikey).
#### Google Gemini Custom Models {#google-custom-models}

You can add custom models to the Google Gemini provider by adding the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "google": {
      "available_models": [
        {
          "name": "custom-model",
          "max_tokens": 128000
        }
      ]
    }
  }
}
```

Custom models will be listed in the model dropdown in the assistant panel.

### Ollama {#ollama}

Download and install Ollama from [ollama.com/download](https://ollama.com/download) (Linux or macOS) and verify the installation with `ollama --version`.

You can use Ollama with the Zed assistant because Ollama exposes an OpenAI-compatible API endpoint.

1. Download, for example, the `mistral` model with Ollama:

   ```sh
   ollama pull mistral
   ```

2. Make sure that the Ollama server is running. You can start it either by running the Ollama app or by launching:

   ```sh
   ollama serve
   ```

3. In the assistant panel, select one of the Ollama models using the model dropdown.
4. (Optional) If you want to change the default URL that is used to access the Ollama server, you can do so by adding the following settings:

```json
{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434"
    }
  }
}
```

### OpenAI {#openai}

1. Create an [OpenAI API key](https://platform.openai.com/account/api-keys).
2. Make sure that your OpenAI account has credits.
3. Open the configuration view (`assistant: show configuration`) and navigate to the OpenAI section.
4. Enter your OpenAI API key.

The OpenAI API key will be saved in your keychain.

Zed will also use the `OPENAI_API_KEY` environment variable if it's defined.

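As a sketch, you could export the variable in your shell profile; the key below is a placeholder, not a real key. Note that Zed only sees the variable when launched from an environment where it is set:

```shell
# Make the key available to processes started from this shell
# (replace the placeholder with your actual key).
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the variable is set.
echo "$OPENAI_API_KEY"
```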
#### OpenAI Custom Models {#openai-custom-models}

You can add custom models to the OpenAI provider by adding the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "openai": {
      "version": "1",
      "available_models": [
        {
          "name": "custom-model",
          "max_tokens": 128000
        }
      ]
    }
  }
}
```

Custom models will be listed in the model dropdown in the assistant panel.

### Advanced configuration {#advanced-configuration}

#### Example Configuration

```json
{
  "assistant": {
    "enabled": true,
    "default_model": {
      "provider": "zed.dev",
      "model": "claude-3-5-sonnet"
    },
    "version": "2",
    "button": true,
    "default_width": 480,
    "dock": "right"
  }
}
```

#### Custom endpoints {#custom-endpoint}

You can use a custom API endpoint for different providers, as long as it's compatible with that provider's API structure. To do so, add the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "some-provider": {
      "api_url": "http://localhost:11434/v1"
    }
  }
}
```

Here, `some-provider` can be any of the following values: `anthropic`, `google`, `ollama`, `openai`.

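For example, to point the `openai` provider at a local OpenAI-compatible server, you could use a snippet like the following (the URL is a placeholder; substitute whatever address your server actually listens on):

```json
{
  "language_models": {
    "openai": {
      "api_url": "http://localhost:8080/v1"
    }
  }
}
```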
#### Custom timeout {#provider-timeout}

You can customize the timeout that's used for LLM requests by adding the following to your Zed `settings.json`:

```json
{
  "language_models": {
    "some-provider": {
      "low_speed_timeout_in_seconds": 10
    }
  }
}
```

Here, `some-provider` can be any of the following values: `anthropic`, `copilot_chat`, `google`, `ollama`, `openai`.

#### Configuring the default model {#default-model}

The default model can be changed by clicking the model dropdown (top-right) in the assistant panel; picking a model saves it as the default. You can also set the default model manually by editing the `default_model` object in your `settings.json`:

```json
{
  "assistant": {
    "version": "2",
    "default_model": {
      "provider": "zed.dev",
      "model": "claude-3-5-sonnet"
    }
  }
}
```

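As a sketch, assuming you have pulled the `mistral` model as described in the [Ollama](#ollama) section, you could make it the default instead:

```json
{
  "assistant": {
    "version": "2",
    "default_model": {
      "provider": "ollama",
      "model": "mistral"
    }
  }
}
```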
#### Common Panel Settings

| key            | type    | default | description                                                                          |
| -------------- | ------- | ------- | ------------------------------------------------------------------------------------ |
| enabled        | boolean | true    | Disabling this will completely disable the assistant.                                 |
| button         | boolean | true    | Show the assistant icon.                                                              |
| dock           | string  | "right" | The default dock position for the assistant panel: "left", "right", or "bottom".      |
| default_height | number  | null    | The pixel height of the assistant panel when docked to the bottom.                    |
| default_width  | number  | null    | The pixel width of the assistant panel when docked to the left or right.              |
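Putting the panel settings together, the following illustrative fragment docks the panel to the bottom with a fixed height (the values are arbitrary examples, following the numeric style of the example configuration above):

```json
{
  "assistant": {
    "enabled": true,
    "button": true,
    "dock": "bottom",
    "default_height": 320
  }
}
```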