lsp: Fix LSP restart breaking semantic token highlighting (#51452)

Finn Eitreim created

Closes #51450 

When you restart the LSP, it does not sufficiently clear cached semantic token data. With `semantic_tokens = "full"`, this meant you would have no syntax highlighting at all, and toggling semantic tokens on and off in the menu had no effect. This change properly clears the cached state, and things work again!

Before:


https://github.com/user-attachments/assets/67ac1be1-ae3d-4c84-afbc-056fd81f63f0

After:


https://github.com/user-attachments/assets/644f8297-8003-4d74-b962-81ba9bb8274c

You might notice that the syntax highlighting in the videos is quite sparse, especially compared to non-semantic-token-based highlighting, and you would be correct! But that's just how it is with `semantic_tokens: "full"`. Other editors, like Neovim, provide basic syntax highlighting that Zed doesn't (Zed usually doesn't need to, thanks to Tree-sitter, but Tree-sitter is disabled here). However, if we turn that extra syntax highlighting off, we can see that Neovim actually matches Zed here:

<img width="847" height="485" alt="Screenshot 2026-03-12 at 11 33 19 PM"
src="https://github.com/user-attachments/assets/7f90789c-dac3-41bf-9d19-640c6c7b1144"
/>


Before you mark this PR as ready for review, make sure that you have:
- [x] Added solid test coverage and/or screenshots from doing manual
testing
- [x] Done a self-review taking into account security and performance
aspects
- [x] Aligned any UI changes with the [UI
checklist](https://github.com/zed-industries/zed/blob/main/CONTRIBUTING.md#uiux-checklist)

Release Notes:

- lsp: Fixed restarting the LSP breaking semantic token highlighting.

Change summary

crates/project/src/lsp_store.rs                 | 5 +----
crates/project/src/lsp_store/semantic_tokens.rs | 8 ++++++++
2 files changed, 9 insertions(+), 4 deletions(-)

Detailed changes

crates/project/src/lsp_store.rs 🔗

@@ -3963,10 +3963,7 @@ impl BufferLspData {
         self.inlay_hints.remove_server_data(for_server);
 
         if let Some(semantic_tokens) = &mut self.semantic_tokens {
-            semantic_tokens.raw_tokens.servers.remove(&for_server);
-            semantic_tokens
-                .latest_invalidation_requests
-                .remove(&for_server);
+            semantic_tokens.remove_server_data(for_server);
         }
 
         if let Some(folding_ranges) = &mut self.folding_ranges {

crates/project/src/lsp_store/semantic_tokens.rs 🔗

@@ -610,6 +610,14 @@ pub struct SemanticTokensData {
     update: Option<(Global, SemanticTokensTask)>,
 }
 
+impl SemanticTokensData {
+    pub(super) fn remove_server_data(&mut self, server_id: LanguageServerId) {
+        self.raw_tokens.servers.remove(&server_id);
+        self.latest_invalidation_requests.remove(&server_id);
+        self.update = None;
+    }
+}
+
 /// All the semantic token tokens for a buffer.
 ///
 /// This aggregates semantic tokens from multiple language servers in a specific order.