# Assistant Tooling

Bringing Language Model tool calling to GPUI.

This unlocks:

- **Structured Extraction** of model responses
- **Validation** of model inputs
- **Execution** of chosen tools

## Overview

Language Models can produce structured outputs that are well suited to calling functions. The best known of these is OpenAI's tool calling. When making a chat completion, you can pass a list of tools available to the model. The model will choose `0..n` tools to help it complete a user's task. It's up to _you_ to create the tools that the model can call.

> **User**: "Hey I need help with implementing a collapsible panel in GPUI"
>
> **Assistant**: "Sure, I can help with that. Let me see what I can find."
>
> `tool_calls: ["name": "query_codebase", arguments: "{ 'query': 'GPUI collapsible panel' }"]`
>
> `result: "['crates/gpui/src/panel.rs:12: impl Panel { ... }', 'crates/gpui/src/panel.rs:20: impl Panel { ... }']"`
>
> **Assistant**: "Here are some excerpts from the GPUI codebase that might help you."

This library is designed to facilitate this interaction mode by allowing you to go from `struct` to `tool` with two simple traits, `LanguageModelTool` and `ToolView`.
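
For illustration, here is roughly what the `WeatherTool` used in the next section might look like. Treat it as a sketch: the field names, the `WeatherApiClient` type, and the derive set are assumptions, and the `LanguageModelTool` implementation itself (name, description, execution, and the associated `ToolView`) is left to the trait definitions in this crate.

```rust
use schemars::JsonSchema;
use serde::Deserialize;

// Hypothetical input type: the fields declared here become the JSON schema
// the model sees when deciding how to call the tool.
#[derive(Debug, Deserialize, JsonSchema)]
struct WeatherQuery {
    /// The city to fetch the current weather for.
    city: String,
}

// Stub client, standing in for whatever API access the tool needs.
struct WeatherApiClient;

// Hypothetical tool struct holding the state execution needs.
struct WeatherTool {
    api_client: WeatherApiClient,
}

// Implementing `LanguageModelTool` for `WeatherTool` (not shown) is where the
// tool gets its name and description for the model, and where a `WeatherQuery`
// is turned into a result the assistant can render and report back.
```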

## Using the Tool Registry

```rust
let mut tool_registry = ToolRegistry::new();
tool_registry
    .register(WeatherTool { api_client })
    .unwrap(); // You can only register one tool per name

let completion = cx.update(|cx| {
    CompletionProvider::get(cx).complete(
        model_name,
        messages,
        Vec::new(),
        1.0,
        // The definitions get passed directly to OpenAI when you want
        // the model to be able to call your tool
        tool_registry.definitions(),
    )
});

let mut stream = completion?.await?;

let mut message = AssistantMessage::new();

while let Some(delta) = stream.next().await {
    // As messages stream in, you'll get both assistant content...
    if let Some(content) = &delta.content {
        message
            .body
            .update(cx, |message, cx| message.append(content, cx));
    }

    // ...and tool calls!
    for tool_call_delta in delta.tool_calls {
        let index = tool_call_delta.index as usize;
        if index >= message.tool_calls.len() {
            message.tool_calls.resize_with(index + 1, Default::default);
        }
        let tool_call = &mut message.tool_calls[index];

        // IDs arrive in pieces, so build the full ID up across deltas
        if let Some(id) = &tool_call_delta.id {
            tool_call.id.push_str(id);
        }

        tool_registry.update_tool_call(
            tool_call,
            tool_call_delta.name.as_deref(),
            tool_call_delta.arguments.as_deref(),
            cx,
        );
    }
}
```

Once the stream of tokens is complete, you can execute each tool call by calling `tool_registry.execute_tool_call(tool_call, cx)`, which returns a `Task<Result<()>>`.
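
For example, once the assistant's turn has finished streaming you might drive every requested call to completion before building the follow-up request. The loop below is only a sketch: whether `execute_tool_call` takes the call by reference or by value, and how you surface errors, depends on the actual signatures, so adjust accordingly.

```rust
// Sketch: execute every tool call the model requested once streaming ends.
// `message` and `tool_registry` are the values built up in the loop above;
// the error handling here is illustrative.
for tool_call in &message.tool_calls {
    if let Err(error) = tool_registry.execute_tool_call(tool_call, cx).await {
        log::error!("tool call failed: {error:?}");
    }
}
```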

As the tokens stream in and tool calls are executed, your `ToolView` will receive updates. Render each tool call by passing that `tool_call` to `tool_registry.render_tool_call(tool_call, cx)`. The final message for the model can be pulled by calling `self.tool_registry.content_for_tool_call(tool_call, &mut project_context, cx)`.
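
As a rough illustration, a view that owns the message and the registry might collect the rendered calls like this. The `AnyElement` return type is an assumption for this sketch; swap in whatever the `ToolView` machinery actually produces.

```rust
// Sketch: render an element for each tool call in the assistant message.
// `self.message` and `self.tool_registry` are assumed fields on the view,
// and the `AnyElement` return type is an assumption, not the real signature.
fn render_tool_calls(&self, cx: &mut ViewContext<Self>) -> Vec<AnyElement> {
    self.message
        .tool_calls
        .iter()
        .map(|tool_call| self.tool_registry.render_tool_call(tool_call, cx))
        .collect()
}
```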