docs(readme): add Ollama and LMStudio config (#538)

Authored by Kujtim Hoxha and Christian Rocha

* docs: improve readme with more info
* chore: small punctuation changes
* chore: small markdown improvements
* docs(readme): copyedits
* docs(readme): remove implicit colon
* docs(readme): fix ephemeral config path on Windows

---------

Co-authored-by: Christian Rocha <christian@rocha.is>

Change summary

README.md | 67 ++++++++++++++++++++++++++++++++++++++++++++++++++++----
1 file changed, 62 insertions(+), 5 deletions(-)

Detailed changes

README.md

@@ -138,19 +138,29 @@ Crush runs great with no configuration. That said, if you do need or want to
 customize Crush, configuration can be added either local to the project itself,
 or globally, with the following priority:
 
-1. `./.crush.json`
-2. `./crush.json`
-3. `$HOME/.config/crush/crush.json`
+1. `.crush.json`
+2. `crush.json`
+3. `$HOME/.config/crush/crush.json` (Windows: `%USERPROFILE%\AppData\Local\crush\crush.json`)
 
 Configuration itself is stored as a JSON object:
 
 ```json
 {
-   "this-setting": { }
-   "that-setting": { }
+   "this-setting": {"this": "that"},
+   "that-setting": ["ceci", "cela"]
 }
 ```
 
+As an additional note, Crush also stores ephemeral data, such as application state, in one additional location:
+
+```bash
+# Unix
+$HOME/.local/share/crush/crush.json
+
+# Windows
+%LOCALAPPDATA%\crush\crush.json
+```
+
 ### LSPs
 
 Crush can use LSPs for additional context to help inform its decisions, just
@@ -245,6 +255,53 @@ permissions. Use this with care.
 You can also skip all permission prompts entirely by running Crush with the
 `--yolo` flag. Be very, very careful with this feature.
 
+### Local Models
+
+Local models can also be configured via an OpenAI-compatible API. Here are two common examples:
+
+#### Ollama
+
+```json
+{
+  "providers": {
+    "ollama": {
+      "name": "Ollama",
+      "base_url": "http://localhost:11434/v1/",
+      "type": "openai",
+      "models": [
+        {
+          "name": "Qwen 3 30B",
+          "id": "qwen3:30b",
+          "context_window": 256000,
+          "default_max_tokens": 20000
+        }
+      ]
+    }
+  }
+}
+```
+
+#### LM Studio
+
+```json
+{
+  "providers": {
+    "lmstudio": {
+      "name": "LM Studio",
+      "base_url": "http://localhost:1234/v1/",
+      "type": "openai",
+      "models": [
+        {
+          "name": "Qwen 3 30B",
+          "id": "qwen/qwen3-30b-a3b-2507",
+          "context_window": 256000,
+          "default_max_tokens": 20000
+        }
+      ]
+    }
+  }
+}
+```
+
 ### Custom Providers
 
 Crush supports custom provider configurations for both OpenAI-compatible and