docs: Document how to use custom `api_url` in Assistant (#11790)

Thorsten Ball created

This essentially documents the comment here:
https://github.com/zed-industries/zed/issues/4424#issuecomment-2053646583

Release Notes:

- N/A

Change summary

docs/src/assistant-panel.md | 55 +++++++++++++++++++++++++++++++++++++++
1 file changed, 55 insertions(+)

Detailed changes

docs/src/assistant-panel.md 🔗

@@ -78,3 +78,58 @@ After you submit your first message, a name for your conversation is generated b
 ## Multiple cursor demo
 
 The assistant is capable of sending multiple requests, and receiving multiple responses, in parallel. [Here's a demo](https://zed.dev/img/post/assistant/demo.webm).
+
+## Using a custom API endpoint for OpenAI
+
+You can use a custom API endpoint for OpenAI, as long as it is compatible with the OpenAI API.
+
+To do so, add the following to your Zed `settings.json`:
+
+```json
+{
+  "assistant": {
+    "version": "1",
+    "provider": {
+      "name": "openai",
+      "type": "openai",
+      "default_model": "gpt-4-turbo-preview",
+      "api_url": "http://localhost:11434/v1"
+    }
+  }
+}
+```
+
+In this example, the custom URL is `http://localhost:11434/v1`, which points at a locally running, OpenAI-compatible server (see the Ollama instructions below).
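+
+One way to check that an endpoint is actually OpenAI-compatible is to send it a chat completion request directly, for example with `curl`. This is only a sketch: it assumes a server is already listening on that address and that it serves the model named in the request body.
+
+```
+# Should return a JSON response containing a "choices" array if the endpoint is OpenAI-compatible
+curl http://localhost:11434/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -d '{
+    "model": "gpt-4-turbo-preview",
+    "messages": [{"role": "user", "content": "Say hello"}]
+  }'
+```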
+
+## Using Ollama on macOS
+
+You can use Ollama with the Zed assistant by making Ollama appear as an OpenAI-compatible API endpoint.
+
+1. Add the following to your Zed `settings.json`:
+
+   ```json
+   {
+     "assistant": {
+       "version": "1",
+       "provider": {
+         "name": "openai",
+         "type": "openai",
+         "default_model": "gpt-4-turbo-preview",
+         "api_url": "http://localhost:11434/v1"
+       }
+     }
+   }
+   ```
+2. Download a model with Ollama, for example `mistral`:
+   ```
+   ollama run mistral
+   ```
+3. Copy the model so that its name matches the `default_model` in your Zed `settings.json` (a quick way to verify the copy is shown after these steps):
+   ```
+   ollama cp mistral gpt-4-turbo-preview
+   ```
+4. Use `assistant: reset key` (see the [Setup](#setup) section above) and enter the following API key:
+   ```
+   ollama
+   ```
+5. Restart Zed.
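+
+To double-check step 3 before restarting, you can list the models Ollama has available locally; the copied model should appear under the name used as `default_model` above:
+
+```
+# Lists locally available Ollama models; the copy from step 3 should show up as gpt-4-turbo-preview
+ollama list
+```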