
LLM Providers

ruley supports multiple LLM providers. Each provider is feature-gated at compile time and requires its own API key (except Ollama).

Provider Comparison

| Provider   | API Key Required | Local | Default Model               | Context Window  |
|------------|------------------|-------|-----------------------------|-----------------|
| Anthropic  | Yes              | No    | claude-sonnet-4-5-20250929  | 200K tokens     |
| OpenAI     | Yes              | No    | gpt-4o                      | 128K tokens     |
| Ollama     | No               | Yes   | llama3.1:70b                | ~100K tokens    |
| OpenRouter | Yes              | No    | anthropic/claude-3.5-sonnet | Varies by model |

Anthropic

Anthropic is the default provider; its Claude models generally produce excellent rule quality.

Setup

export ANTHROPIC_API_KEY="sk-ant-..."
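Since ruley reads the key from the environment, a wrapper script can guard against a missing key before any work starts. A minimal sketch (the check itself is generic shell, not part of ruley):

```shell
# Check for the ANTHROPIC_API_KEY variable named in the setup above
# before invoking ruley, so a missing key fails with a clear message.
if [ -z "${ANTHROPIC_API_KEY:-}" ]; then
  echo "ANTHROPIC_API_KEY is not set" >&2
else
  echo "ANTHROPIC_API_KEY is set"
fi
```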

Usage

# Uses default model (Claude Sonnet 4.5)
ruley --provider anthropic

# Specify a model
ruley --provider anthropic --model claude-sonnet-4-5-20250929

Config File

[general]
provider = "anthropic"

[providers.anthropic]
model = "claude-sonnet-4-5-20250929"
max_tokens = 8192

OpenAI

OpenAI’s GPT models provide strong rule generation with fast response times.

Setup

export OPENAI_API_KEY="sk-..."

Usage

ruley --provider openai --model gpt-4o

Config File

[general]
provider = "openai"

[providers.openai]
model = "gpt-4o"
max_tokens = 4096

Ollama

Ollama runs models locally. No API key is needed, and there are no per-token costs. This is ideal for privacy-sensitive codebases or offline use.

Setup

  1. Install Ollama
  2. Pull a model: ollama pull llama3.1:70b
  3. Start the server: ollama serve

Usage

ruley --provider ollama --model llama3.1:70b

# Custom Ollama host
OLLAMA_HOST="http://192.168.1.100:11434" ruley --provider ollama
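A wrapper script can resolve the host the same way the OLLAMA_HOST override above does, and probe the server before a long run. The liveness check against /api/tags (Ollama's model-list endpoint) is a sketch, not something ruley does itself:

```shell
# Fall back to the standard local Ollama port when OLLAMA_HOST is unset.
host="${OLLAMA_HOST:-http://localhost:11434}"
echo "using $host"

# /api/tags lists locally pulled models; a cheap liveness probe.
if curl -sf "$host/api/tags" > /dev/null; then
  echo "ollama is reachable"
else
  echo "ollama is not reachable at $host"
fi
```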

Config File

[general]
provider = "ollama"

[providers.ollama]
host = "http://localhost:11434"
model = "llama3.1:70b"

Considerations

  • Rule quality depends heavily on the model size. Larger models (70B+) produce better results.
  • Local models have smaller context windows. Use --compress and --chunk-size to manage large codebases.
  • No cost confirmation is shown since Ollama is free to use.
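For a large codebase against a local model, the two flags mentioned above can be combined in one invocation. The chunk size below is illustrative, not a tuned recommendation:

```shell
# Compress context and cap chunk size to fit a smaller local window.
# 4000 is an example value; tune it to your model's context limit.
ruley --provider ollama --model llama3.1:70b --compress --chunk-size 4000
```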

OpenRouter

OpenRouter provides access to models from multiple providers through a single API. It fetches dynamic pricing from the OpenRouter API for accurate cost estimation.
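If you want to inspect the pricing data behind the estimate, OpenRouter's public model catalog can be queried directly (no API key needed); the jq filter below is illustrative:

```shell
# /api/v1/models is OpenRouter's public catalog endpoint; each entry
# includes a per-token pricing object. Filter to one model with jq.
curl -s https://openrouter.ai/api/v1/models |
  jq '.data[] | select(.id == "anthropic/claude-3.5-sonnet") | .pricing'
```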

Setup

export OPENROUTER_API_KEY="sk-or-..."

Usage

ruley --provider openrouter --model anthropic/claude-3.5-sonnet

Config File

[general]
provider = "openrouter"

[providers.openrouter]
model = "anthropic/claude-3.5-sonnet"
max_tokens = 8192

Feature Flags

Providers are compiled in via Cargo feature flags. The default build includes anthropic and openai.

| Feature       | Provider            |
|---------------|---------------------|
| anthropic     | Anthropic (default) |
| openai        | OpenAI (default)    |
| ollama        | Ollama              |
| openrouter    | OpenRouter          |
| all-providers | All of the above    |

To include all providers when building from source:

cargo install ruley --features all-providers
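The reverse is also possible: a minimal build with a single provider, assuming the default anthropic/openai features can be disabled with Cargo's standard flag:

```shell
# Compile only the Ollama provider, dropping the default features
# (standard Cargo feature semantics: --no-default-features + --features).
cargo install ruley --no-default-features --features ollama
```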

Choosing a Provider

  • Best quality: Anthropic Claude (default) – excellent at understanding code conventions
  • Fastest: OpenAI GPT-4o – lower latency per request
  • Free / Private: Ollama – no API costs, data stays local
  • Flexible: OpenRouter – access to many models through one API