# Configuration
All configuration lives in ferret.toml in your repository root. Every setting has a sensible default — only override what you need.
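For example, a repository that only wants a higher inline-comment cap and a lower sampling temperature can check in just those overrides; the keys come from the reference below, but the values here are illustrative:

```toml
# ferret.toml — only the overrides; every other setting keeps its default
[ai]
temperature = 0.1

[review]
max_comments = 50
```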
## Full reference
`ferret.toml`:

```toml
# ── AI Provider ─────────────────────────────────────────────────────
[ai]
# "anthropic" | "openai" | "claude-code" | "gemini" | "bedrock" | "ollama"
provider = "anthropic"
model = "claude-sonnet-4-6"
max_tokens = 4096
temperature = 0.2

# Ollama / Bedrock overrides
# ollama_base_url = "http://localhost:11434"
# bedrock_region = "us-east-1"

# ── Review behaviour ─────────────────────────────────────────────────
[review]
focus = ["bugs", "security", "style", "performance"]
max_comments = 30    # cap inline comments per review
chunk_lines = 200    # lines per diff chunk sent to AI
reflect = false      # second-pass comment refinement

# ── RAG pipeline ──────────────────────────────────────────────────────
[rag]
enabled = false
# "local" | "memory" | "qdrant" | "chroma" | "pinecone"
store = "local"
collection = "ferret"
embed_model = "nomic-embed-text"
ollama_base_url = "http://localhost:11434"
top_k = 5
min_score = 0.70
chunk_lines = 80
index_extensions = [".rs", ".ts", ".py", ".go", ".java", ".md"]
local_path = "ferret-rag.jsonl"

# Qdrant
# qdrant_url = "http://localhost:6333"
# qdrant_api_key = ""    # for Qdrant Cloud

# ChromaDB
# chroma_url = "http://localhost:8000"

# Pinecone
# pinecone_host = "https://my-index-xyz.svc.us-east1.pinecone.io"
# pinecone_api_key = ""  # or set PINECONE_API_KEY env var

# ── Autonomous agent ──────────────────────────────────────────────────
[agent]
max_iterations = 10
max_memory_messages = 50
# memory_file = ".ferret-memory.jsonl"
default_channel = "cli"
port = 8090
```
## AI providers
| Provider | `provider` key | Env var |
|---|---|---|
| Anthropic Claude | `anthropic` | `ANTHROPIC_API_KEY` |
| OpenAI GPT-4o | `openai` | `OPENAI_API_KEY` |
| Claude Code CLI | `claude-code` | none (uses the `claude` binary) |
| Google Gemini | `gemini` | `GEMINI_API_KEY` |
| AWS Bedrock | `bedrock` | `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` |
| Ollama | `ollama` | none (local server) |
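Switching providers is a matter of changing the `[ai]` table. As an illustrative sketch, pointing Ferret at a local Ollama server might look like this (the model name is an example; use any model pulled into your Ollama instance):

```toml
[ai]
provider = "ollama"
model = "llama3.1"                          # example: any locally pulled model
ollama_base_url = "http://localhost:11434"  # default Ollama address
```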
## Environment variables
Secrets are always read from environment variables — never put them in ferret.toml.
| Variable | Purpose |
|---|---|
| `ANTHROPIC_API_KEY` | Claude API key |
| `OPENAI_API_KEY` | OpenAI API key |
| `GEMINI_API_KEY` | Google Gemini API key |
| `GITHUB_TOKEN` | GitHub token (auto-provided in Actions) |
| `GITLAB_TOKEN` | GitLab token (`$CI_JOB_TOKEN` in CI) |
| `BITBUCKET_TOKEN` | Bitbucket bearer token |
| `AZURE_DEVOPS_TOKEN` | Azure DevOps PAT |
| `GITEA_TOKEN` | Gitea API token |
| `PINECONE_API_KEY` | Pinecone API key |
| `SNYK_TOKEN` | Snyk API token (for the `/snyk` command) |
| `SLACK_BOT_TOKEN` | Slack bot token (agent channel) |
| `DISCORD_BOT_TOKEN` | Discord bot token (agent channel) |
| `DISCORD_CHANNEL_ID` | Discord channel to monitor |
| `FERRET_GITHUB_SECRET` | Webhook HMAC secret (bot mode) |
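As a sketch, secrets can be exported in the shell (or stored as CI secrets) before running Ferret; the values below are placeholders, not real credentials:

```shell
# Placeholder values; substitute real credentials from your secret store.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
# Platform token so Ferret can post review comments:
export GITHUB_TOKEN="ghp_placeholder"
```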
## CLI flags
```shell
# Use a different config file
$ ferret --config /path/to/custom.toml review

# Override output format
$ ferret review --output json
$ ferret run /describe --output json

# Local review (no platform API)
$ ferret review --diff changes.diff
```