open_ai: Enable parallel tool calling for models that support it (#52203)

Bennet Bo Fenner created

## Context

We seem to have disabled it in #28056 because our agent did not
support parallel tool calls at the time.

## Self-Review Checklist

- [x] I've reviewed my own diff for quality, security, and reliability
- [x] Unsafe blocks (if any) have justifying comments
- [x] The content is consistent with the [UI/UX
checklist](https://github.com/zed-industries/zed/blob/main/CONTRIBUTING.md#uiux-checklist)
- [x] Tests cover the new/changed behavior
- [x] Performance impact has been considered and is acceptable

Release Notes:

- Improved: Enabled parallel tool calling for OpenAI models that support it

## Change summary

crates/language_models/src/provider/open_ai.rs | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)

## Detailed changes

crates/language_models/src/provider/open_ai.rs

```diff
@@ -514,8 +514,7 @@ pub fn into_open_ai(
         temperature: request.temperature.or(Some(1.0)),
         max_completion_tokens: max_output_tokens,
         parallel_tool_calls: if supports_parallel_tool_calls && !request.tools.is_empty() {
-            // Disable parallel tool calls, as the Agent currently expects a maximum of one per turn.
-            Some(false)
+            Some(supports_parallel_tool_calls)
         } else {
             None
         },
```
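The effect of the change can be sketched in isolation. This is a minimal, hypothetical standalone version of the field-selection logic (the helper name `parallel_tool_calls_value` is invented for illustration and does not exist in the codebase): when the model supports parallel tool calls and the request carries tools, the flag now mirrors model support instead of being hard-coded to `false`; otherwise the field is omitted entirely so unsupported models never see it.

```rust
// Hypothetical sketch of how the `parallel_tool_calls` field is chosen
// after this change. `supports` stands in for `supports_parallel_tool_calls`,
// `has_tools` for `!request.tools.is_empty()`.
fn parallel_tool_calls_value(supports: bool, has_tools: bool) -> Option<bool> {
    if supports && has_tools {
        // Previously this branch returned Some(false); now it mirrors model
        // support, which is always true inside this branch.
        Some(supports)
    } else {
        // Omit the field for models that do not support it, or when the
        // request carries no tools.
        None
    }
}

fn main() {
    assert_eq!(parallel_tool_calls_value(true, true), Some(true));
    assert_eq!(parallel_tool_calls_value(true, false), None);
    assert_eq!(parallel_tool_calls_value(false, true), None);
}
```

Note that inside the `if` branch `supports` is always `true`, so `Some(supports)` is equivalent to `Some(true)`; the diff keeps the variable so the intent (follow model capability) reads directly from the code.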