ollama: Change default context size to 4096 (#31682)

Created by tidely

Ollama increased its default context size from 2048 to 4096 tokens in
version v0.6.7, which was released over a month ago.

https://github.com/ollama/ollama/releases/tag/v0.6.7

Release Notes:

- ollama: Update default model context to 4096 (matching upstream)

Change summary

crates/ollama/src/ollama.rs | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)

Detailed changes

crates/ollama/src/ollama.rs

@@ -42,7 +42,7 @@ pub struct Model {
 
 fn get_max_tokens(name: &str) -> usize {
     /// Default context length for unknown models.
-    const DEFAULT_TOKENS: usize = 2048;
+    const DEFAULT_TOKENS: usize = 4096;
     /// Magic number. Lets many Ollama models work with ~16GB of ram.
     const MAXIMUM_TOKENS: usize = 16384;
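
The diff only touches the two constants, so the rest of `get_max_tokens` is not shown. A minimal sketch of how the function plausibly behaves, assuming it looks up known model names and falls back to `DEFAULT_TOKENS` for anything unrecognized; the model entries below are hypothetical, not the actual table in the crate:

```rust
// A minimal sketch, not the actual Zed implementation: the diff only shows
// the constants, so the lookup table and clamping below are assumptions.
fn get_max_tokens(name: &str) -> usize {
    /// Default context length for unknown models.
    const DEFAULT_TOKENS: usize = 4096;
    /// Magic number. Lets many Ollama models work with ~16GB of ram.
    const MAXIMUM_TOKENS: usize = 16384;

    // Hypothetical per-model context lengths; the real table lives in
    // crates/ollama/src/ollama.rs and covers many more models.
    let tokens = match name {
        "llama3.2" => 128_000,
        "mistral" => 32_768,
        _ => DEFAULT_TOKENS,
    };

    // Clamp so large-context models stay within the ~16GB memory budget.
    tokens.min(MAXIMUM_TOKENS)
}
```

Under this reading, the one-line change means unknown models now get Ollama's new upstream default of 4096 tokens instead of the old 2048-token window, while known models are still capped at `MAXIMUM_TOKENS`.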