Quick Start
This guide walks you through generating your first set of AI IDE rules with ruley.
Prerequisites
- ruley installed (see Installation)
- An API key for at least one LLM provider
Step 1: Set Your API Key
Set the environment variable for your chosen provider:
Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
OpenAI
export OPENAI_API_KEY="sk-..."
Ollama
# No API key needed -- just ensure Ollama is running
ollama serve
OpenRouter
export OPENROUTER_API_KEY="sk-or-..."
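Before running ruley, you can sanity-check that a key is actually exported in your current shell; a minimal POSIX shell sketch (the check_provider_keys helper is illustrative, not part of ruley):

```shell
#!/bin/sh
# Report which cloud provider API keys are set in this environment.
check_provider_keys() {
  found=0
  [ -n "$ANTHROPIC_API_KEY" ]  && { echo "anthropic key set";  found=1; }
  [ -n "$OPENAI_API_KEY" ]     && { echo "openai key set";     found=1; }
  [ -n "$OPENROUTER_API_KEY" ] && { echo "openrouter key set"; found=1; }
  [ "$found" -eq 1 ] || echo "no cloud provider key set (Ollama needs none)"
}
check_provider_keys
```

If nothing is printed but the fallback line, re-check that you used export rather than a plain assignment, since ruley runs in a child process and only sees exported variables.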
Step 2: Generate Rules
Navigate to your project directory and run ruley:
cd /path/to/your/project
ruley
By default, ruley uses Anthropic Claude and generates Cursor format rules.
Step 3: Review the Output
ruley shows you:
- Scan results – How many files were discovered
- Compression stats – Token reduction from tree-sitter compression
- Cost estimate – Projected LLM cost before proceeding
- Confirmation prompt – You must approve before the LLM call is made
- Generated files – Where the rule files were written
Common Variations
Use a Different Provider
ruley --provider openai --model gpt-4o
Generate Multiple Formats
ruley --format cursor,claude,copilot
Generate All Formats at Once
ruley --format all
Enable Tree-Sitter Compression
ruley --compress
Analyze a Specific Directory
ruley ./my-project --compress
Dry Run (Preview Without Calling LLM)
ruley --dry-run
This shows what would be processed (file count, token estimate, cost) without making any LLM calls. Useful for checking costs before committing to a paid run.
Skip Cost Confirmation
ruley --no-confirm
Use a Local Ollama Model
ruley --provider ollama --model llama3.1
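The variations above compose. A sketch of a preview-then-generate workflow using only the flags documented in this guide (the run_ruley_workflow wrapper and its command argument are illustrative conveniences for previewing, not ruley features):

```shell
#!/bin/sh
# Preview cost with --dry-run first, then generate for real.
run_ruley_workflow() {
  cmd="$1"                      # pass "ruley" for a real run
  $cmd --dry-run --provider openai --model gpt-4o
  $cmd --provider openai --model gpt-4o --format cursor,claude
}
# Safe preview: prints the two commands instead of executing ruley.
run_ruley_workflow "echo ruley"
```

Once the dry-run numbers look reasonable, call the wrapper with "ruley" to perform the actual generation.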
What Happens Next
The generated rule files are placed in your project directory at the standard locations for each format. Your AI IDE tools will automatically pick them up:
- Cursor: .cursor/rules/*.mdc – loaded automatically by Cursor IDE
- Claude: CLAUDE.md – read by Claude Code as project context
- Copilot: .github/copilot-instructions.md – loaded by GitHub Copilot
- Windsurf: .windsurfrules – loaded by Windsurf IDE
- Aider: .aider.conf.yml – loaded by Aider CLI
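To confirm which rule files a run actually produced, you can check the standard locations listed above; a small shell sketch (the check_rule_files helper is illustrative, not part of ruley):

```shell
#!/bin/sh
# Check which generated rule files exist in a project directory.
check_rule_files() {
  dir="${1:-.}"
  for f in .cursor/rules CLAUDE.md .github/copilot-instructions.md \
           .windsurfrules .aider.conf.yml
  do
    if [ -e "$dir/$f" ]; then
      echo "present: $f"
    else
      echo "missing: $f"
    fi
  done
}
check_rule_files .
```

Only the formats you requested with --format will be present; a missing entry for an unrequested format is expected.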
Commit the generated files to your repository so your whole team benefits from consistent AI assistance.
Next Steps
- Command-Line Interface – Full reference for all CLI options
- Configuration – Set up a ruley.toml for your project
- LLM Providers – Compare providers and choose the best fit
- Output Formats – Understand what each format produces
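If you run ruley regularly, a ruley.toml can capture your defaults so you don't repeat flags on every invocation. A hypothetical sketch only: the key names below simply mirror the CLI flags in this guide and are assumptions, not a confirmed schema; see the Configuration page for the real format.

```toml
# Hypothetical ruley.toml -- key names are illustrative assumptions
# mirroring the CLI flags above, not a confirmed schema.
provider = "anthropic"
format = ["cursor", "claude"]
compress = true
```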