
# rumilo

Rumilo is a CLI that dispatches specialized research subagents. It supports two modes:

- `web` for web research (search + fetch, stored in a sandboxed workspace)
- `repo` for git repository exploration (clone to workspace, git-aware tools)

## Requirements

- Bun
- Git (for repo mode)
- Kagi session token
- Tabstack API key

## Configuration

Rumilo reads configuration from `$XDG_CONFIG_HOME/rumilo/config.toml`.

Example:

```toml
[defaults]
model = "anthropic:claude-sonnet-4-20250514"
cleanup = true

[web]
model = "anthropic:claude-sonnet-4-20250514"

[repo]
model = "anthropic:claude-sonnet-4-20250514"
```
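
If the file does not exist yet, you can create it first. A minimal sketch (assuming `$XDG_CONFIG_HOME` falls back to `~/.config` when unset, per the XDG convention):

```bash
# Create the config directory, falling back to ~/.config when
# XDG_CONFIG_HOME is unset, then create an empty config file.
mkdir -p "${XDG_CONFIG_HOME:-$HOME/.config}/rumilo"
touch "${XDG_CONFIG_HOME:-$HOME/.config}/rumilo/config.toml"
```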

### Custom Models

You can define custom OpenAI-compatible endpoints (Ollama, vLLM, or other self-hosted models) in the `[custom_models]` section:

```toml
[custom_models.ollama]
provider = "ollama"
api = "openai-completions"
base_url = "http://localhost:11434/v1"
id = "ollama/llama3"
name = "Llama 3 (Ollama)"
reasoning = false
input = ["text"]
cost = { input = 0, output = 0 }
context_window = 128000
max_tokens = 4096
```

Use custom models with the `custom:` prefix:

```bash
rumilo web "query" --model custom:ollama
rumilo repo -u <uri> "query" --model custom:ollama
```

#### Custom Model Fields

- `provider`: Provider identifier (e.g., "ollama", "custom")
- `api`: API type; typically "openai-completions"
- `base_url`: API endpoint URL
- `id`: Unique model identifier
- `name`: Human-readable display name
- `reasoning`: Whether the model supports thinking/reasoning
- `input`: Input modalities: `["text"]` or `["text", "image"]`
- `cost`: Cost per million tokens (use 0 for local models)
- `context_window`: Maximum context size in tokens
- `max_tokens`: Maximum output tokens
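
For example, the same fields can describe a locally hosted vLLM server. The base URL and model id below are placeholders for your own deployment, not values rumilo ships with:

```toml
[custom_models.vllm]
provider = "custom"
api = "openai-completions"
base_url = "http://localhost:8000/v1"    # placeholder: your vLLM server
id = "meta-llama/Llama-3.1-8B-Instruct"  # placeholder: model served by vLLM
name = "Llama 3.1 8B (vLLM)"
reasoning = false
input = ["text"]
cost = { input = 0, output = 0 }
context_window = 128000
max_tokens = 4096
```

Invoke it as `--model custom:vllm`.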

#### Compatibility Flags (Optional)

Some OpenAI-compatible endpoints have quirks. Use the `compat` section to override the defaults:

```toml
[custom_models.mistral]
provider = "mistral"
api = "openai-completions"
base_url = "https://api.mistral.ai/v1"
# ... other fields ...

[custom_models.mistral.compat]
max_tokens_field = "max_tokens"
requires_tool_result_name = true
requires_thinking_as_text = true
requires_mistral_tool_ids = true
```

See the [pi-ai documentation](https://deepwiki.com/badlogic/pi-mono/2.6-custom-models-and-compatibility) for all compat flags.

### Credentials

Set credentials either in the config file or via environment variables:

- `KAGI_SESSION_TOKEN`: Kagi session token
- `TABSTACK_API_KEY`: Tabstack API key
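
For example, to supply both via the environment (the values below are placeholders; substitute your real credentials):

```bash
# Export placeholder credentials; replace with your real values.
export KAGI_SESSION_TOKEN="your-kagi-session-token"
export TABSTACK_API_KEY="your-tabstack-api-key"
```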

## Usage

```bash
rumilo web "how does X work"
rumilo web -u https://example.com/docs "explain the auth flow"
rumilo repo -u https://github.com/org/repo "how is caching implemented"
```