@@ -13,13 +13,14 @@ Here's an overview of the supported providers and tool call support:
| ----------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [Amazon Bedrock](#amazon-bedrock) | Depends on the model |
| [Anthropic](#anthropic)                        | ✅                                                                                                                                                                            |
-| [DeepSeek](#deepseek)                          | 🚫                                                                                                                                                                            |
+| [DeepSeek](#deepseek)                          | ✅                                                                                                                                                                            |
| [GitHub Copilot Chat](#github-copilot-chat) | For Some Models ([link](https://github.com/zed-industries/zed/blob/9e0330ba7d848755c9734bf456c716bddf0973f3/crates/language_models/src/provider/copilot_chat.rs#L189-L198)) |
| [Google AI](#google-ai)                        | ✅                                                                                                                                                                            |
| [LM Studio](#lmstudio)                         | ✅                                                                                                                                                                            |
| [Mistral](#mistral)                            | ✅                                                                                                                                                                            |
| [Ollama](#ollama)                              | ✅                                                                                                                                                                            |
| [OpenAI](#openai)                              | ✅                                                                                                                                                                            |
+| [OpenRouter](#openrouter)                      | ✅                                                                                                                                                                            |
| [OpenAI API Compatible](#openai-api-compatible) | 🚫                                                                                                                                                                           |
## Use Your Own Keys {#use-your-own-keys}
@@ -164,7 +165,7 @@ You can configure a model to use [extended thinking](https://docs.anthropic.com/
### DeepSeek {#deepseek}
-> 🚫 Does not support tool use
+> ✅ Supports tool use
1. Visit the DeepSeek platform and [create an API key](https://platform.deepseek.com/api_keys)
2. Open the settings view (`agent: open configuration`) and go to the DeepSeek section
@@ -351,7 +352,9 @@ Depending on your hardware or use-case you may wish to limit or increase the con
"name": "qwen2.5-coder",
"display_name": "qwen 2.5 coder 32K",
"max_tokens": 32768,
- "supports_tools": true
+ "supports_tools": true,
+ "supports_thinking": true,
+ "supports_images": true
}
]
}
@@ -371,6 +374,12 @@ The `supports_tools` option controls whether or not the model will use additiona
If the model is tagged with `tools` in the Ollama catalog, this option should be supplied, and the built-in profiles `Ask` and `Write` can be used.
If the model is not tagged with `tools` in the Ollama catalog, this option can still be supplied with the value `true`; however, be aware that only the `Minimal` built-in profile will work.
+The `supports_thinking` option controls whether or not the model will perform an explicit "thinking" (reasoning) pass before producing its final answer.
+If the model is tagged with `thinking` in the Ollama catalog, set this option to `true` to use it in Zed.
+
+The `supports_images` option enables the model's vision capabilities, allowing it to process images included in the conversation context.
+If the model is tagged with `vision` in the Ollama catalog, set this option to `true` to use it in Zed.
+
### OpenAI {#openai}
> ✅ Supports tool use
@@ -416,6 +425,21 @@ You must provide the model's Context Window in the `max_tokens` parameter; this
OpenAI `o1` models should set `max_completion_tokens` as well to avoid incurring high reasoning token costs.
Custom models will be listed in the model dropdown in the Agent Panel.
+### OpenRouter {#openrouter}
+
+> ✅ Supports tool use
+
+OpenRouter provides access to multiple AI models through a single API. It supports tool use for compatible models.
+
+1. Visit [OpenRouter](https://openrouter.ai) and create an account
+2. Generate an API key from your [OpenRouter keys page](https://openrouter.ai/keys)
+3. Open the settings view (`agent: open configuration`) and go to the OpenRouter section
+4. Enter your OpenRouter API key
+
+The OpenRouter API key will be saved in your keychain.
+
+Zed will also use the `OPENROUTER_API_KEY` environment variable if it's defined.
+
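For example, the variable can be exported in the shell that launches Zed (the key value below is a placeholder, not a real key):

```shell
# Placeholder key: substitute your own key from https://openrouter.ai/keys
export OPENROUTER_API_KEY="sk-or-v1-xxxxxxxx"

# Launch Zed from this same shell so it inherits the variable, e.g.:
# zed .
```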
### OpenAI API Compatible {#openai-api-compatible}
Zed supports using OpenAI compatible APIs by specifying a custom `endpoint` and `available_models` for the OpenAI provider.
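As a sketch of what such a configuration might look like in `settings.json`, assuming the provider settings follow the same shape as the Ollama example earlier in this document (the URL, key names, and model values here are illustrative, not a definitive schema):

```json
{
  "language_models": {
    "openai": {
      "api_url": "http://localhost:8000/v1",
      "available_models": [
        {
          "name": "my-local-model",
          "display_name": "My Local Model",
          "max_tokens": 32768
        }
      ]
    }
  }
}
```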