# Assistant Tooling

Bringing OpenAI-compatible tool calling to GPUI.

This unlocks:

- **Structured Extraction** of model responses
- **Validation** of model inputs
- **Execution** of chosen tools

## Overview

Language models can produce structured outputs that are perfect for calling functions. The most famous of these is OpenAI's tool calling. When you make a chat completion, you can pass a list of tools available to the model. The model will choose `0..n` tools to help it complete a user's task. It's up to _you_ to create the tools that the model can call.

> **User**: "Hey I need help with implementing a collapsible panel in GPUI"
>
> **Assistant**: "Sure, I can help with that. Let me see what I can find."
>
> `tool_calls: [{ "name": "query_codebase", "arguments": "{ 'query': 'GPUI collapsible panel' }" }]`
>
> `result: "['crates/gpui/src/panel.rs:12: impl Panel { ... }', 'crates/gpui/src/panel.rs:20: impl Panel { ... }']"`
>
> **Assistant**: "Here are some excerpts from the GPUI codebase that might help you."

This library is designed to facilitate this interaction mode by allowing you to go from `struct` to `tool` with a simple trait: `LanguageModelTool`.
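
Concretely, the trait you implement looks roughly like this. This is a simplified sketch inferred from the example below, not the crate's exact definition, which may carry additional bounds or methods:

```rust
use anyhow::Result;
use gpui::{AppContext, Task};
use schemars::JsonSchema;
use serde::de::DeserializeOwned;

pub trait LanguageModelTool {
    /// The arguments the model must produce, described to it as a JSON schema.
    type Input: DeserializeOwned + JsonSchema;
    /// What the tool hands back to the conversation.
    type Output;

    /// The function name the model will call, e.g. `"query_codebase"`.
    fn name(&self) -> String;
    /// Guidance the model reads when deciding whether to use this tool.
    fn description(&self) -> String;
    /// Runs the tool on deserialized, validated input.
    fn execute(&self, input: Self::Input, cx: &AppContext) -> Task<Result<Self::Output>>;
}
```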

## Example

Let's expose a semantic index for the model to query directly. First, we'll set up some _necessary_ imports:

```rust
use anyhow::Result;
use assistant_tooling::{LanguageModelTool, ToolRegistry};
use gpui::{App, AppContext, Task};
use schemars::JsonSchema;
use serde::Deserialize;
use serde_json::json;
```

Then we'll define the query structure the model must fill in. This _must_ derive `Deserialize` from `serde` and `JsonSchema` from the `schemars` crate.

```rust
#[derive(Deserialize, JsonSchema)]
struct CodebaseQuery {
    query: String,
}
```
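
The derived `JsonSchema` implementation is how the tool's parameters get described to the model. If you're curious what it generates, you can inspect it with `schemars` directly (a quick sanity check, not something the library requires):

```rust
// Print the JSON schema the model will see for `CodebaseQuery`.
let schema = schemars::schema_for!(CodebaseQuery);
println!("{}", serde_json::to_string_pretty(&schema).unwrap());
// The output is an object schema with one required string property, "query".
```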

After that we can define our tool, with the expectation that it will need a `ProjectIndex` to search against. For this example, the index uses the same interface as `semantic_index::ProjectIndex`.

```rust
struct ProjectIndex {}

impl ProjectIndex {
    fn new() -> Self {
        ProjectIndex {}
    }

    fn search(&self, query: &str, _limit: usize, _cx: &AppContext) -> Task<Result<Vec<String>>> {
        // Instead of hooking up a real index, we're going to fake it
        if query.contains("gpui") {
            return Task::ready(Ok(vec![r#"// crates/gpui/src/gpui.rs
//! # Welcome to GPUI!
//!
//! GPUI is a hybrid immediate and retained mode, GPU accelerated, UI framework
//! for Rust, designed to support a wide variety of applications
"#
            .to_string()]));
        }
        Task::ready(Ok(vec![]))
    }
}

struct ProjectIndexTool {
    project_index: ProjectIndex,
}
```

Now we can implement the `LanguageModelTool` trait for our tool by:

- Defining the `Input` from the model, which is `CodebaseQuery`
- Defining the `Output`
- Implementing the `name` and `description` functions to provide the model information when it's choosing a tool
- Implementing the `execute` function to run the tool

```rust
impl LanguageModelTool for ProjectIndexTool {
    type Input = CodebaseQuery;
    type Output = String;

    fn name(&self) -> String {
        "query_codebase".to_string()
    }

    fn description(&self) -> String {
        "Executes a query against the codebase, returning excerpts related to the query".to_string()
    }

    fn execute(&self, query: Self::Input, cx: &AppContext) -> Task<Result<Self::Output>> {
        let results = self.project_index.search(query.query.as_str(), 10, cx);

        cx.spawn(|_cx| async move {
            let results = results.await?;

            if !results.is_empty() {
                Ok(results.join("\n"))
            } else {
                Ok("No results".to_string())
            }
        })
    }
}
```
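
Between `name`, `description`, and the `Input` schema, the registry now has everything needed to advertise this tool in a chat completion request. As a rough sketch, using OpenAI's standard `tools` wire format (illustrative; not necessarily the crate's exact serialization):

```rust
// Approximately what lands in the `tools` array of the request (illustrative).
let tool_definition = json!({
    "type": "function",
    "function": {
        "name": "query_codebase",
        "description": "Executes a query against the codebase, returning excerpts related to the query",
        "parameters": schemars::schema_for!(CodebaseQuery),
    }
});
```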

For the sake of this example, let's look at the types that OpenAI will be passing to us:

```rust
// OpenAI definitions, shown here for demonstration
#[derive(Deserialize)]
struct FunctionCall {
    name: String,
    args: String,
}

#[derive(Deserialize, Eq, PartialEq)]
enum ToolCallType {
    #[serde(rename = "function")]
    Function,
    Other,
}

#[derive(Deserialize, Clone, Debug, Eq, PartialEq, Hash, Ord, PartialOrd)]
struct ToolCallId(String);

#[derive(Deserialize)]
#[serde(tag = "type", rename_all = "snake_case")]
enum ToolCall {
    Function {
        #[allow(dead_code)]
        id: ToolCallId,
        function: FunctionCall,
    },
    Other {
        #[allow(dead_code)]
        id: ToolCallId,
    },
}

#[derive(Deserialize)]
struct AssistantMessage {
    role: String,
    content: Option<String>,
    tool_calls: Option<Vec<ToolCall>>,
}
```

When the model wants to call tools, it will pass a list of `ToolCall`s. When those are `function`s that we can handle, we'll pass them to our `ToolRegistry` to get a future that we can await.

```rust
// Inside `fn main()`
App::new().run(|cx: &mut AppContext| {
    let tool = ProjectIndexTool {
        project_index: ProjectIndex::new(),
    };

    let mut registry = ToolRegistry::new();
    let registered = registry.register(tool);
    assert!(registered.is_ok());
```

Let's pretend the model sent back a message requesting a tool call:

```rust
let model_response = json!({
    "role": "assistant",
    "tool_calls": [
        {
            "id": "call_1",
            "function": {
                "name": "query_codebase",
                "args": r#"{"query":"GPUI Task background_executor"}"#
            },
            "type": "function"
        }
    ]
});

let message: AssistantMessage = serde_json::from_value(model_response).unwrap();

// We know there's a tool call, so let's skip straight to it for this example
let tool_calls = message.tool_calls.as_ref().unwrap();
let tool_call = tool_calls.get(0).unwrap();
```

We can now use our registry to call the tool. Since `ToolCall` is an enum, we first pull the function call out of it:

```rust
let ToolCall::Function { function, .. } = tool_call else {
    panic!("expected a function tool call");
};

// The registry dispatches by name and hands the raw JSON arguments to the tool
let task = registry.call(&function.name, &function.args);

cx.spawn(|_cx| async move {
    let result = task.await?;
    println!("{}", result);
    anyhow::Ok(())
})
.detach();
```
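
One detail worth noting: a GPUI `Task` is cancelled when it is dropped, so the `.detach()` call at the end is what lets the spawned tool call run to completion.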