Quick Start

This guide walks you through generating your first set of AI IDE rules with ruley.

Prerequisites

  1. ruley installed (see Installation)
  2. An API key for at least one LLM provider
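To confirm the install worked before going further, you can try printing the version. This assumes ruley follows the common `--version` convention; if it does not, `ruley --help` should list the supported flags:

```shell
# Hypothetical sanity check -- assumes ruley supports the usual --version flag
ruley --version
```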

Step 1: Set Your API Key

Set the environment variable for your chosen provider:

Anthropic

export ANTHROPIC_API_KEY="sk-ant-..."

OpenAI

export OPENAI_API_KEY="sk-..."

Ollama

# No API key needed -- just ensure Ollama is running
ollama serve
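To verify the server is actually up, you can hit its local HTTP endpoint (Ollama listens on port 11434 by default):

```shell
# Quick health check -- the root endpoint responds "Ollama is running"
curl http://localhost:11434
```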

OpenRouter

export OPENROUTER_API_KEY="sk-or-..."

Step 2: Generate Rules

Navigate to your project directory and run ruley:

cd /path/to/your/project
ruley

By default, ruley uses Anthropic Claude and generates Cursor format rules.
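Those defaults can also be spelled out explicitly. Using the flags shown later in this guide, the default run is roughly equivalent to:

```shell
# Explicit form of the default invocation (provider/format per the defaults above)
ruley --provider anthropic --format cursor
```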

Step 3: Review the Output

ruley shows you:

  1. Scan results – How many files were discovered
  2. Compression stats – Token reduction from tree-sitter compression
  3. Cost estimate – Estimated LLM cost before proceeding
  4. Confirmation prompt – You must approve before the LLM call is made
  5. Generated files – Where the rule files were written

Common Variations

Use a Different Provider

ruley --provider openai --model gpt-4o

Generate Multiple Formats

ruley --format cursor,claude,copilot

Generate All Formats at Once

ruley --format all

Enable Tree-Sitter Compression

ruley --compress

Analyze a Specific Directory

ruley ./my-project --compress

Dry Run (Preview Without Calling LLM)

ruley --dry-run

This shows what would be processed (file count, token estimate, cost) without making any LLM calls. Useful for checking costs before committing.
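Because --dry-run skips the LLM call entirely, it combines safely with the other flags. For example, to preview the cost of a specific provider and model before running for real:

```shell
# Preview only -- no API call is made, so no cost is incurred
ruley --provider openai --model gpt-4o --dry-run
```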

Skip Cost Confirmation

ruley --no-confirm

Use a Local Ollama Model

ruley --provider ollama --model llama3.1
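The model must be available locally before it can be used; pull it with Ollama first:

```shell
# Download the model once; subsequent runs reuse the local copy
ollama pull llama3.1

# Then point ruley at the local model
ruley --provider ollama --model llama3.1
```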

What Happens Next

The generated rule files are placed in your project directory at the standard locations for each format. Your AI IDE tools will automatically pick them up:

  • Cursor: .cursor/rules/*.mdc – loaded automatically by Cursor IDE
  • Claude: CLAUDE.md – read by Claude Code as project context
  • Copilot: .github/copilot-instructions.md – loaded by GitHub Copilot
  • Windsurf: .windsurfrules – loaded by Windsurf IDE
  • Aider: .aider.conf.yml – loaded by Aider CLI

Commit the generated files to your repository so your whole team benefits from consistent AI assistance.
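For example, after a Cursor + Claude run, the commit might look like this (paths per the list above; adjust for the formats you generated):

```shell
# Track the generated rule files so teammates get the same AI context
git add .cursor/rules CLAUDE.md
git commit -m "Add AI IDE rules generated by ruley"
```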

Next Steps