language_models: Actually override Ollama model from settings (#38628)
Umesh Yadav
Currently, model parameters such as `max_tokens` specified in `settings.json` for an Ollama model do not override the values reported by the Ollama API; the API's values are always used instead. For example, with the settings below, Zed still uses the API's default `context_length` of 4k even though `max_tokens` is set to 64000.
```json
"language_models": {
  "ollama": {
    "available_models": [
      {
        "name": "qwen3-coder:latest",
        "display_name": "Qwen 3 Coder",
        "max_tokens": 64000,
        "supports_tools": true,
        "keep_alive": "15m",
        "supports_thinking": false,
        "supports_images": false
      }
    ]
  }
},
```
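Conceptually, the fix is a precedence rule when assembling the model list: a model configured under `available_models` should shadow the same-named model reported by the API. Below is a minimal sketch of that rule with hypothetical types and field names (`OllamaModel`, `resolve_model`); the actual Zed types differ, so treat this as an illustration of the precedence, not the real implementation.

```rust
#[derive(Clone, Debug)]
struct OllamaModel {
    name: String,
    display_name: Option<String>,
    max_tokens: usize,
    supports_tools: Option<bool>,
    supports_thinking: Option<bool>,
    supports_images: Option<bool>,
}

/// Prefer the user-configured model (from `available_models` in
/// `settings.json`) over the model description fetched from the API.
fn resolve_model(from_settings: Option<&OllamaModel>, from_api: &OllamaModel) -> OllamaModel {
    match from_settings {
        // A settings entry exists: its values win.
        Some(configured) => configured.clone(),
        // No settings entry: fall back to the API-reported model.
        None => from_api.clone(),
    }
}

fn main() {
    // What the Ollama API might report for the model.
    let api_model = OllamaModel {
        name: "qwen3-coder:latest".into(),
        display_name: None,
        max_tokens: 4096, // API default context_length
        supports_tools: None,
        supports_thinking: None,
        supports_images: None,
    };
    // What the user configured in settings.json.
    let configured = OllamaModel {
        name: "qwen3-coder:latest".into(),
        display_name: Some("Qwen 3 Coder".into()),
        max_tokens: 64000, // user override
        supports_tools: Some(true),
        supports_thinking: Some(false),
        supports_images: Some(false),
    };
    let resolved = resolve_model(Some(&configured), &api_model);
    assert_eq!(resolved.max_tokens, 64000);
    println!("resolved max_tokens = {}", resolved.max_tokens);
}
```

With this rule in place, the configured `max_tokens` of 64000 is used rather than the API's 4k default, which matches the behavior described in the release note below.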
Release Notes:
- Fixed an issue where Ollama model parameters were not being correctly
overridden by user settings.