assistant2: Handle LLM providers that do not emit `StartMessage` events (#23485)

Marshall Bowers created

This PR updates Assistant2's response streaming to work with LLM
providers that do not emit `StartMessage` events.

Now, if we receive a `Text` event without having first received a `StartMessage`
event, we still insert an Assistant message so we can stream in the model's
response.
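
The fallback can be sketched roughly as follows. This is a hypothetical, simplified model of the logic, not the actual Assistant2 API: the `Role`, `Message`, `Thread`, and `handle_text_chunk` names are illustrative, and the real code operates on events inside a `cx` context.

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum Role {
    User,
    Assistant,
}

#[derive(Debug)]
struct Message {
    role: Role,
    text: String,
}

#[derive(Default, Debug)]
struct Thread {
    messages: Vec<Message>,
}

impl Thread {
    /// Called for each streamed `Text` event from the provider.
    fn handle_text_chunk(&mut self, chunk: &str) {
        match self.messages.last_mut() {
            // An Assistant message already exists (e.g. the provider
            // emitted `StartMessage`): append the chunk to it.
            Some(last) if last.role == Role::Assistant => {
                last.text.push_str(chunk);
            }
            // No Assistant message yet (no `StartMessage` was emitted):
            // treat this chunk as the start of a new Assistant response.
            _ => self.messages.push(Message {
                role: Role::Assistant,
                text: chunk.to_string(),
            }),
        }
    }
}

fn main() {
    let mut thread = Thread::default();
    // The provider never sent `StartMessage`; the first `Text` chunk
    // still creates an Assistant message, and later chunks append to it.
    thread.handle_text_chunk("Hello");
    thread.handle_text_chunk(", world");
    println!("{:?}", thread.messages);
}
```

With this shape, providers that do emit `StartMessage` take the first arm (the message already exists), while providers that skip it fall through to the second arm and still produce a streaming Assistant message.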

Release Notes:

- N/A

Change summary

crates/assistant2/src/thread.rs | 7 +++++++
1 file changed, 7 insertions(+)

Detailed changes

crates/assistant2/src/thread.rs

@@ -308,6 +308,13 @@ impl Thread {
                                             last_message.id,
                                             chunk,
                                         ));
+                                    } else {
+                                        // If we don't have an Assistant message yet, assume this chunk marks the beginning
+                                        // of a new Assistant response.
+                                        //
+                                        // Importantly: We do *not* want to emit a `StreamedAssistantText` event here, as it
+                                        // will result in duplicating the text of the chunk in the rendered Markdown.
+                                        thread.insert_message(Role::Assistant, chunk, cx);
                                     }
                                 }
                             }