@@ -435,21 +435,24 @@ To do it via your `settings.json`, add the following snippet under `language_mod
```json
{
"language_models": {
- "openai": {
- "api_url": "https://api.together.xyz/v1", // Using Together AI as an example
- "available_models": [
- {
- "name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
- "display_name": "Together Mixtral 8x7B",
- "max_tokens": 32768,
- "capabilities": {
- "tools": true,
- "images": false,
- "parallel_tool_calls": false,
- "prompt_cache_key": false
+ "openai_compatible": {
+ // Using Together AI as an example
+ "Together AI": {
+ "api_url": "https://api.together.xyz/v1",
+ "available_models": [
+ {
+ "name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
+ "display_name": "Together Mixtral 8x7B",
+ "max_tokens": 32768,
+ "capabilities": {
+ "tools": true,
+ "images": false,
+ "parallel_tool_calls": false,
+ "prompt_cache_key": false
+ }
}
- }
- ]
+ ]
+ }
}
}
}
@@ -463,7 +466,7 @@ By default, OpenAI-compatible models inherit the following capabilities:
- `prompt_cache_key`: false (does not support `prompt_cache_key` parameter)
Note that LLM API keys aren't stored in your settings file.
-So, ensure you have it set in your environment variables (`OPENAI_API_KEY=<your api key>`) so your settings can pick it up.
+Ensure the key is set as an environment variable (`<PROVIDER_NAME>_API_KEY=<your api key>`) so your settings can pick it up. In the example above, that would be `TOGETHER_AI_API_KEY=<your api key>`.
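+
+As a quick sketch of that naming rule (uppercase the provider name and replace spaces with underscores; a POSIX shell is assumed):
+
+```shell
+provider="Together AI"   # the provider key from settings.json
+# Derive the environment variable name from the provider name:
+var_name="$(printf '%s' "$provider" | tr '[:lower:]' '[:upper:]' | tr ' ' '_')_API_KEY"
+echo "$var_name"         # prints TOGETHER_AI_API_KEY
+```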
### OpenRouter {#openrouter}