# Command-Line Interface

## Usage

```sh
ruley [OPTIONS] [PATH]
```

`PATH`: Path to repository (local path or remote URL). Defaults to `.` (the current directory).

## Options

### Core Options

| Flag | Env Variable | Default | Description |
|---|---|---|---|
| `-p, --provider <NAME>` | `RULEY_PROVIDER` | `anthropic` | LLM provider (`anthropic`, `openai`, `ollama`, `openrouter`) |
| `-m, --model <NAME>` | `RULEY_MODEL` | (provider default) | Model to use |
| `-f, --format <FORMATS>` | `RULEY_FORMAT` | `cursor` | Output format(s), comma-separated |
| `-o, --output <PATH>` | `RULEY_OUTPUT` | (format default) | Output file path (single format only) |
| `-c, --config <PATH>` | `RULEY_CONFIG` | `ruley.toml` | Config file path |
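Since `-c, --config` defaults to `ruley.toml`, the same settings can live in a config file. The sketch below is an assumption, not taken from this reference: the key names are guessed to mirror the long flag names, and every value is illustrative.

```toml
# ruley.toml -- illustrative sketch; key names and values are assumptions
# based on the flags above, not confirmed by this reference.
provider = "anthropic"
format = "cursor"
compress = true
chunk-size = 100000

# Hypothetical filtering keys mirroring --include / --exclude:
include = ["**/*.rs"]
exclude = ["**/tests/**"]
```

Remember the precedence order: any of these values is overridden by the matching `RULEY_*` environment variable, which in turn is overridden by the CLI flag.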

### Generation Options

| Flag | Env Variable | Default | Description |
|---|---|---|---|
| `--description <TEXT>` | `RULEY_DESCRIPTION` | (none) | Focus area for rule generation |
| `--rule-type <TYPE>` | `RULEY_RULE_TYPE` | `auto` | Cursor rule type (`auto`, `always`, `manual`, `agent-requested`) |
| `--compress` | `RULEY_COMPRESS` | `false` | Enable tree-sitter compression |
| `--chunk-size <N>` | `RULEY_CHUNK_SIZE` | `100000` | Max tokens per LLM chunk |
| `--repomix-file <PATH>` | `RULEY_REPOMIX_FILE` | (none) | Use a pre-packed repomix file as input |

### Filtering Options

| Flag | Description |
|---|---|
| `--include <PATTERN>` | Include only matching files (repeatable) |
| `--exclude <PATTERN>` | Exclude matching files (repeatable) |

### Behavior Options

| Flag | Env Variable | Default | Description |
|---|---|---|---|
| `--no-confirm` | `RULEY_NO_CONFIRM` | `false` | Skip cost confirmation prompt |
| `--dry-run` | `RULEY_DRY_RUN` | `false` | Show plan without calling LLM |
| `--on-conflict <STRATEGY>` | `RULEY_ON_CONFLICT` | `prompt` | Conflict resolution (`prompt`, `overwrite`, `skip`, `smart-merge`) |
| `--retry-on-validation-failure` | | `false` | Auto-retry with LLM fix on validation failure |
| `--no-deconflict` | | `false` | Disable LLM-based deconfliction with existing rules |
| `--no-semantic-validation` | | `false` | Disable all semantic validation checks |

### Output Options

| Flag | Description |
|---|---|
| `-v` | Increase verbosity (`-v` = DEBUG, `-vv` = TRACE) |
| `-q` | Suppress non-essential output |
| `--version` | Print version information |
| `--help` | Print help information |

## Environment Variables

All CLI flags can be set via `RULEY_*` environment variables. CLI flags take precedence over environment variables, which take precedence over config file values.
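This precedence can be pictured as a fallback chain. The sketch below is plain POSIX shell with hypothetical values, showing only the resolution logic; the real lookup happens inside `ruley` itself.

```sh
# Resolution order: CLI flag > RULEY_* env var > config file > built-in default.
# All values are illustrative.
unset RULEY_PROVIDER             # start from a clean slate for this demo

flag=""                          # what --provider was set to ("" = not passed)
envvar="${RULEY_PROVIDER:-}"     # environment variable, "" if unset
config="openai"                  # hypothetical value read from ruley.toml

# ${x:-y} falls through on empty, so each level defers to the next.
provider="${flag:-${envvar:-${config:-anthropic}}}"
echo "$provider"                 # prints "openai": no flag, no env var
```

Passing `--provider ollama` would set the first link in the chain and win over everything else.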

### Provider API Keys

| Variable | Provider | Required |
|---|---|---|
| `ANTHROPIC_API_KEY` | Anthropic | When using `--provider anthropic` |
| `OPENAI_API_KEY` | OpenAI | When using `--provider openai` |
| `OLLAMA_HOST` | Ollama | Optional (default: `http://localhost:11434`) |
| `OPENROUTER_API_KEY` | OpenRouter | When using `--provider openrouter` |

## Examples

### Basic Usage

```sh
# Analyze the current directory with defaults
ruley

# Analyze a specific project
ruley /path/to/project
```

### Provider Selection

```sh
# Use OpenAI with a specific model
ruley --provider openai --model gpt-4o

# Use local Ollama
ruley --provider ollama --model llama3.1

# Use OpenRouter with Claude
ruley --provider openrouter --model anthropic/claude-3.5-sonnet
```

### Format Control

```sh
# Generate Cursor rules (default)
ruley --format cursor

# Generate multiple formats
ruley --format cursor,claude,copilot

# Generate all formats
ruley --format all

# Write to a specific path (single format only)
ruley --format claude --output ./docs/CLAUDE.md
```

### Compression and Performance

```sh
# Enable tree-sitter compression (~70% token reduction)
ruley --compress

# Adjust chunk size for large codebases
ruley --chunk-size 200000

# Use a pre-packed repomix file
ruley --repomix-file ./codebase.xml
```

### Cost Management

```sh
# Preview without calling the LLM
ruley --dry-run

# Skip the cost confirmation prompt
ruley --no-confirm
```

### Conflict Resolution

```sh
# Overwrite existing rule files
ruley --on-conflict overwrite

# Skip if files already exist
ruley --on-conflict skip

# Use the LLM to smart-merge with existing rules
ruley --on-conflict smart-merge
```

### Filtering Files

```sh
# Only include Rust files
ruley --include "**/*.rs"

# Exclude test directories
ruley --exclude "**/tests/**" --exclude "**/benches/**"
```