
Configuration

All configuration lives in ferret.toml in your repository root. Every setting has a sensible default — only override what you need.

Full reference

ferret.toml

```toml
# ── AI Provider ──────────────────────────────────────────────────────
[ai]
# "anthropic" | "openai" | "claude-code" | "gemini" | "bedrock" | "ollama"
provider = "anthropic"
model = "claude-sonnet-4-6"
max_tokens = 4096
temperature = 0.2

# Ollama / Bedrock overrides
# ollama_base_url = "http://localhost:11434"
# bedrock_region = "us-east-1"

# ── Review behaviour ─────────────────────────────────────────────────
[review]
focus = ["bugs", "security", "style", "performance"]
max_comments = 30   # cap inline comments per review
chunk_lines = 200   # lines per diff chunk sent to AI
reflect = false     # second-pass comment refinement

# ── RAG pipeline ─────────────────────────────────────────────────────
[rag]
enabled = false
# "local" | "memory" | "qdrant" | "chroma" | "pinecone"
store = "local"
collection = "ferret"
embed_model = "nomic-embed-text"
ollama_base_url = "http://localhost:11434"
top_k = 5
min_score = 0.70
chunk_lines = 80
index_extensions = [".rs", ".ts", ".py", ".go", ".java", ".md"]
local_path = "ferret-rag.jsonl"

# Qdrant
# qdrant_url = "http://localhost:6333"
# qdrant_api_key = ""   # for Qdrant Cloud

# ChromaDB
# chroma_url = "http://localhost:8000"

# Pinecone
# pinecone_host = "https://my-index-xyz.svc.us-east1.pinecone.io"
# pinecone_api_key = ""   # or set PINECONE_API_KEY env var

# ── Autonomous agent ─────────────────────────────────────────────────
[agent]
max_iterations = 10
max_memory_messages = 50
# memory_file = ".ferret-memory.jsonl"
default_channel = "cli"
port = 8090
```
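Because every key above has a default, a working config can be far smaller than the full reference. A minimal sketch that overrides only a few AI settings (the model name and values here are illustrative, not recommendations):

```toml
# Minimal ferret.toml; everything omitted falls back to its default.
[ai]
provider = "openai"
model = "gpt-4o"     # illustrative model name
temperature = 0.0    # deterministic reviews
```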

AI providers

| Provider | Key | Env var |
| --- | --- | --- |
| Anthropic Claude | `anthropic` | `ANTHROPIC_API_KEY` |
| OpenAI GPT-4o | `openai` | `OPENAI_API_KEY` |
| Claude Code CLI | `claude-code` | none (uses `claude` binary) |
| Google Gemini | `gemini` | `GEMINI_API_KEY` |
| AWS Bedrock | `bedrock` | `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` |
| Ollama | `ollama` | none (local server) |
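For a fully local, key-free setup, combine the `ollama` provider with the `ollama_base_url` override from the reference above. A sketch (the model name is illustrative and must already be pulled into your Ollama instance):

```toml
[ai]
provider = "ollama"
model = "llama3"                             # illustrative; any locally pulled model
ollama_base_url = "http://localhost:11434"   # Ollama's default port
```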

Environment variables

Secrets are always read from environment variables — never put them in ferret.toml.

| Variable | Purpose |
| --- | --- |
| `ANTHROPIC_API_KEY` | Claude API key |
| `OPENAI_API_KEY` | OpenAI API key |
| `GEMINI_API_KEY` | Google Gemini API key |
| `GITHUB_TOKEN` | GitHub token (auto-provided in Actions) |
| `GITLAB_TOKEN` | GitLab token (`$CI_JOB_TOKEN` in CI) |
| `BITBUCKET_TOKEN` | Bitbucket bearer token |
| `AZURE_DEVOPS_TOKEN` | Azure DevOps PAT |
| `GITEA_TOKEN` | Gitea API token |
| `PINECONE_API_KEY` | Pinecone API key |
| `SNYK_TOKEN` | Snyk API token (for `/snyk` command) |
| `SLACK_BOT_TOKEN` | Slack bot token (agent channel) |
| `DISCORD_BOT_TOKEN` | Discord bot token (agent channel) |
| `DISCORD_CHANNEL_ID` | Discord channel to monitor |
| `FERRET_GITHUB_SECRET` | Webhook HMAC secret (bot mode) |
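For local runs, the simplest approach is to export the variables in your shell before invoking ferret. The key values below are placeholders:

```shell
# Placeholder values; never commit real keys, and never put them in ferret.toml.
export ANTHROPIC_API_KEY="sk-ant-xxxx"
export GITHUB_TOKEN="ghp_xxxx"
```

In CI, set these through your platform's secret store instead of shell profiles.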

CLI flags

```shell
# Use a different config file
$ ferret --config /path/to/custom.toml review

# Override output format
$ ferret review --output json
$ ferret run /describe --output json

# Local review (no platform API)
$ ferret review --diff changes.diff
```