language_models: Make `JsonSchemaSubset` the default `tool_input_format` for the OpenAI-compatible provider (#34921)

Closes #30188
Closes #34911
Closes #34906

Many OpenAI-compatible providers do not automatically filter the tool
schema to comply with the underlying model's requirements; they simply
proxy the request. As a result, requests fail for models like **Gemini**,
**Grok**, and **Claude** (when accessed via LiteLLM on Bedrock), which
reject Zed's default tool schema.

This PR addresses the problem by defaulting the OpenAI-compatible provider
to the more broadly compatible `JsonSchemaSubset` tool schema format instead
of the full JSON Schema, as illustrated in the sketch below.
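
To make the incompatibility concrete, here is a minimal sketch (plain Rust using the `serde_json` crate) contrasting a tool schema that relies on constructs some strict providers reject with a subset-friendly equivalent. The specific keywords shown (`oneOf`, string `format`) are illustrative assumptions, not a definition of what `JsonSchemaSubset` actually filters.

```rust
use serde_json::json;

fn main() {
    // The kind of tool schema a client might send by default. Keywords like
    // `oneOf` and string `format` values are examples of constructs that
    // stricter providers have rejected; the exact keyword set is assumed
    // here for illustration only.
    let full_schema = json!({
        "type": "object",
        "properties": {
            "path": { "type": "string", "format": "uri" },
            "mode": { "oneOf": [{ "const": "read" }, { "const": "write" }] }
        },
        "required": ["path"]
    });

    // A subset-friendly equivalent that sticks to widely supported keywords:
    // `type`, `enum`, `properties`, and `required`.
    let subset_schema = json!({
        "type": "object",
        "properties": {
            "path": { "type": "string" },
            "mode": { "type": "string", "enum": ["read", "write"] }
        },
        "required": ["path"]
    });

    println!("{full_schema:#}\n{subset_schema:#}");
}
```

A strict provider would accept the second form while potentially rejecting the first with a `400 Bad Request`, as reported in the linked issues.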

### Why this approach?

* **Avoids Poor User Experience:** One alternative was to add an option
for users to manually set the JSON schema for models that return a `400
Bad Request` due to an invalid tool schema. This was discarded as it
provides a poor user experience.
* **Simplifies Complex Logic:** Another option was to filter the schema
based on the model ID. However, as the linked issues demonstrate, this is
unreliable: `claude-4-sonnet`, for example, only fails when proxied through
LiteLLM on Bedrock. Determining the right format reliably would require a
non-trivial implementation that tracks provider-and-model combinations (see
the sketch after this list).
* **Better Default Behavior:** The current approach ensures that tool
usage works out-of-the-box for the majority of cases by default,
providing the most robust and user-friendly solution.
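
For a sense of what that would have involved, here is a hypothetical sketch of the discarded model-ID approach; the names (`ToolSchemaFormat`, `format_for`) and match arms are made up for illustration and are not Zed's actual types. It shows how special cases accumulate and why a proxy such as LiteLLM defeats detection based on the model ID alone.

```rust
// Hypothetical sketch of the discarded alternative: picking a schema format
// per provider/model combination. The table of special cases grows with every
// new provider, and the model ID alone cannot reveal what sits behind a proxy.
#[derive(Debug, PartialEq)]
enum ToolSchemaFormat {
    JsonSchema,       // full JSON Schema
    JsonSchemaSubset, // restricted subset for stricter models
}

fn format_for(provider: &str, model_id: &str) -> ToolSchemaFormat {
    match (provider, model_id) {
        // Models known to reject the full schema, keyed on the model ID...
        (_, m) if m.starts_with("gemini") || m.starts_with("grok") => {
            ToolSchemaFormat::JsonSchemaSubset
        }
        // ...but `claude-4-sonnet` only fails when proxied through LiteLLM on
        // Bedrock, so the provider has to be special-cased too.
        ("litellm", m) if m.starts_with("claude") => ToolSchemaFormat::JsonSchemaSubset,
        _ => ToolSchemaFormat::JsonSchema,
    }
}

fn main() {
    // The same model ID needs different handling depending on the proxy.
    assert_eq!(
        format_for("litellm", "claude-4-sonnet"),
        ToolSchemaFormat::JsonSchemaSubset
    );
    assert_eq!(
        format_for("anthropic", "claude-4-sonnet"),
        ToolSchemaFormat::JsonSchema
    );
}
```

Defaulting every OpenAI-compatible model to the subset avoids maintaining such a table entirely.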


Release Notes:

- Improved tool compatibility with OpenAI API-compatible providers

Signed-off-by: Umesh Yadav <git@umesh.dev>
Co-authored-by: Peter Tripp <peter@zed.dev>

Change summary

crates/language_models/src/provider/open_ai_compatible.rs | 6 +++++-
1 file changed, 5 insertions(+), 1 deletion(-)

Detailed changes

crates/language_models/src/provider/open_ai_compatible.rs

@@ -9,7 +9,7 @@ use language_model::{
     AuthenticateError, LanguageModel, LanguageModelCompletionError, LanguageModelCompletionEvent,
     LanguageModelId, LanguageModelName, LanguageModelProvider, LanguageModelProviderId,
     LanguageModelProviderName, LanguageModelProviderState, LanguageModelRequest,
-    LanguageModelToolChoice, RateLimiter,
+    LanguageModelToolChoice, LanguageModelToolSchemaFormat, RateLimiter,
 };
 use menu;
 use open_ai::{ResponseStreamEvent, stream_completion};
@@ -322,6 +322,10 @@ impl LanguageModel for OpenAiCompatibleLanguageModel {
         self.model.capabilities.tools
     }
 
+    fn tool_input_format(&self) -> LanguageModelToolSchemaFormat {
+        LanguageModelToolSchemaFormat::JsonSchemaSubset
+    }
+
     fn supports_images(&self) -> bool {
         self.model.capabilities.images
     }