
# Configuration

All configuration lives in `merlin.toml` in your repository root. Every setting has a sensible default; only override what you need.
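For instance, a minimal `merlin.toml` might override only the model and the review focus and leave everything else at its default (the values below are illustrative, not recommendations):

```toml
# merlin.toml -- settings not listed here keep their defaults
[ai]
provider = "anthropic"
model = "claude-sonnet-4-6"

[review]
focus = ["bugs", "security"]
```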

## Full reference

`merlin.toml`:

```toml
# ── AI Provider ──────────────────────────────────────────────────────
[ai]
# "anthropic" | "openai" | "claude-code" | "gemini" | "bedrock"
# "azure-openai" | "ollama" | "groq" | "together-ai" | "deep-seek"
# "mistral" | "open-router"
provider = "anthropic"
model = "claude-sonnet-4-6"
max_tokens = 4096
temperature = 0.2

# Provider-specific overrides (uncomment as needed)
# ollama_base_url = "http://localhost:11434"
# bedrock_region = "us-east-1"
# azure_openai_endpoint = "https://my-resource.openai.azure.com"
# azure_openai_deployment = "my-gpt4o-deployment"

# ── Review behaviour ─────────────────────────────────────────────────
[review]
focus = ["bugs", "security", "style", "performance"]
max_comments = 30   # cap inline comments per review
chunk_lines = 200   # lines per diff chunk sent to the AI
reflect = false     # second-pass comment refinement

# ── RAG pipeline ─────────────────────────────────────────────────────
[rag]
enabled = false
embedder = "openai"        # "openai" | "ollama"
embed_model = "text-embedding-3-small"
# "local" | "memory" | "qdrant" | "chroma" | "pinecone"
store = "local"
collection = "merlin"
top_k = 5
min_score = 0.70
chunk_lines = 80
index_extensions = [".rs", ".ts", ".py", ".go", ".java", ".md"]
local_path = "merlin-rag.jsonl"

# Ollama embedder
# embedder = "ollama"
# embed_model = "nomic-embed-text"
# ollama_base_url = "http://localhost:11434"

# Qdrant
# qdrant_url = "http://localhost:6333"
# qdrant_api_key = ""      # for Qdrant Cloud

# ChromaDB
# chroma_url = "http://localhost:8000"

# Pinecone
# pinecone_host = "https://my-index-xyz.svc.us-east1.pinecone.io"
# pinecone_api_key = ""    # or set PINECONE_API_KEY env var

# ── Autonomous agent ─────────────────────────────────────────────────
[agent]
max_iterations = 10
max_memory_messages = 50
# memory_file = ".merlin-memory.jsonl"
default_channel = "cli"
port = 8090
```

## AI providers

| Provider | `provider` value | Key env var |
|---|---|---|
| Anthropic Claude | `anthropic` | `ANTHROPIC_API_KEY` |
| OpenAI GPT-4o | `openai` | `OPENAI_API_KEY` |
| Google Gemini | `gemini` | `GEMINI_API_KEY` |
| AWS Bedrock | `bedrock` | `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` |
| Azure OpenAI | `azure-openai` | `AZURE_OPENAI_API_KEY` |
| Claude Code CLI | `claude-code` | `CLAUDE_CODE_TOKEN` (headless CI) |
| Groq | `groq` | `GROQ_API_KEY` |
| Together AI | `together-ai` | `TOGETHER_API_KEY` |
| DeepSeek | `deep-seek` | `DEEPSEEK_API_KEY` |
| Mistral AI | `mistral` | `MISTRAL_API_KEY` |
| OpenRouter | `open-router` | `OPENROUTER_API_KEY` |
| Ollama (local) | `ollama` | none (local server) |

## Environment variables

Secrets are always read from environment variables; never put them in `merlin.toml`.
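For local runs, export the secrets your provider and platform need before invoking merlin; in CI, set them as protected secrets instead. A minimal example for Anthropic plus GitHub (the values below are placeholders, not real credentials):

```shell
# Placeholder values; substitute your real credentials
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export GITHUB_TOKEN="ghp-placeholder"
```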

### AI providers

| Variable | Provider |
|---|---|
| `ANTHROPIC_API_KEY` | Anthropic Claude |
| `OPENAI_API_KEY` | OpenAI (review and/or RAG embeddings) |
| `GEMINI_API_KEY` | Google Gemini |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI |
| `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` | AWS Bedrock |
| `AWS_SESSION_TOKEN` | AWS Bedrock (temporary credentials) |
| `CLAUDE_CODE_TOKEN` | Claude Code CLI headless auth |
| `GROQ_API_KEY` | Groq |
| `TOGETHER_API_KEY` | Together AI |
| `DEEPSEEK_API_KEY` | DeepSeek |
| `MISTRAL_API_KEY` | Mistral AI |
| `OPENROUTER_API_KEY` | OpenRouter |

### VCS platforms

| Variable | Platform | Notes |
|---|---|---|
| `GITHUB_TOKEN` | GitHub | Auto-provided by Actions |
| `GITLAB_TOKEN` | GitLab | Use `$CI_JOB_TOKEN` in CI |
| `BITBUCKET_TOKEN` | Bitbucket | Use `$BITBUCKET_STEP_TOKEN` in CI |
| `AZURE_DEVOPS_TOKEN` | Azure DevOps | Use `$(System.AccessToken)` in CI |
| `GITEA_TOKEN` | Gitea | Auto-provided by Gitea Actions 1.21+ |

### Integrations

| Variable | Purpose |
|---|---|
| `PINECONE_API_KEY` | Pinecone vector store |
| `SNYK_TOKEN` | Snyk dependency scanning (`/snyk` command) |
| `JIRA_TOKEN` | Jira issue linking (`/link_jira` command) |
| `LINEAR_API_KEY` | Linear issue linking (`/link_linear` command) |
| `SLACK_BOT_TOKEN` | Slack agent channel |
| `DISCORD_BOT_TOKEN` | Discord agent channel |
| `DISCORD_CHANNEL_ID` | Discord channel to post in |
| `MERLIN_GITHUB_SECRET` | HMAC secret for GitHub webhook verification |
| `MERLIN_GITLAB_SECRET` | Token for GitLab webhook verification |

## CLI flags

```shell
# Use a different config file
$ merlin --config /path/to/custom.toml review

# Override output format
$ merlin review --output json
$ merlin run /describe --output json

# Local review (no platform API)
$ merlin review --diff changes.diff
```