# Provider Setup

Configure API keys and endpoints for each supported LLM provider.
## Supported Providers
| Provider | Flag | Env Variable | Default Endpoint |
|---|---|---|---|
| Anthropic | --provider anthropic | ANTHROPIC_API_KEY | api.anthropic.com/v1/messages |
| OpenAI | --provider open-ai | OPENAI_API_KEY | api.openai.com/v1/chat/completions |
| Google AI | --provider gemini | GOOGLE_API_KEY | Gemini API (key passed in URL) |
| Mistral | --provider mistral | MISTRAL_API_KEY | api.mistral.ai/v1/chat/completions |
| DeepSeek | --provider deep-seek | DEEPSEEK_API_KEY | api.deepseek.com/chat/completions |
| Groq | --provider groq | GROQ_API_KEY | api.groq.com/openai/v1/chat/completions |
| Together AI | --provider together-ai | TOGETHER_API_KEY | api.together.xyz/v1/chat/completions |
| Ollama | --provider ollama | — | localhost:11434 |
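To see at a glance which of the environment variables in the table are set in the current shell, a short loop works; `check_keys` here is a local helper for this page, not a Tachyonic command:

```shell
# List which provider API keys from the table are set in this shell.
# check_keys is a local helper, not part of the Tachyonic CLI.
check_keys() {
  for var in ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_API_KEY MISTRAL_API_KEY \
             DEEPSEEK_API_KEY GROQ_API_KEY TOGETHER_API_KEY; do
    if printenv "$var" > /dev/null; then
      echo "$var: set"
    else
      echo "$var: unset"
    fi
  done
}

check_keys
```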
## Anthropic
```shell
export ANTHROPIC_API_KEY=sk-ant-api03-...
tachyonic scan --provider anthropic --model claude-haiku-4-5-20251001
```

No --target is needed; the scan uses the default Anthropic endpoint.
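To sanity-check the key before the first scan, you can call the Messages API directly. This is a standard Anthropic curl request (note the required anthropic-version header), independent of Tachyonic; the model name matches the scan example above:

```shell
# Optional smoke test of the key against the default Anthropic endpoint.
# Plain Messages API call, not a Tachyonic command.
curl -s https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-haiku-4-5-20251001", "max_tokens": 16,
       "messages": [{"role": "user", "content": "ping"}]}'
```

A valid key returns a JSON message object; an invalid one returns a JSON error body.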
### Session Auth (OAuth)
For Anthropic OAuth tokens (e.g., from Claude Code):
```shell
tachyonic login --provider anthropic
# Follow prompts to paste OAuth token
tachyonic scan --provider anthropic --auth-mode session
```

## OpenAI
```shell
export OPENAI_API_KEY=sk-...
tachyonic scan --provider open-ai --model gpt-4o
```

### Session Auth (Codex)
Tachyonic can reuse credentials from Codex CLI. If you have Codex installed and logged in:
```shell
tachyonic login --provider open-ai
# Reuses existing Codex credentials from ~/.codex/auth.json
tachyonic scan --provider open-ai --auth-mode session
```

If Codex isn't installed, use --device-auth for device-flow guidance:

```shell
tachyonic login --provider open-ai --device-auth
```

## Google AI (Gemini)
```shell
export GOOGLE_API_KEY=AI...
tachyonic scan --provider gemini --model gemini-2.5-flash
```

## Ollama (Local)
Run models locally with no API key:
```shell
# Start Ollama
ollama serve

# Pull a model
ollama pull llama3.1

# Scan
tachyonic scan --provider ollama --model llama3.1
```

Ollama uses the OpenAI-compatible API at localhost:11434.
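If a scan can't reach Ollama, a quick reachability check helps. This uses Ollama's standard /api/tags endpoint (which lists locally pulled models), not a Tachyonic command:

```shell
# Check that the Ollama server is up before scanning.
# /api/tags lists locally pulled models on a running Ollama server.
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "ollama: reachable"
else
  echo "ollama: not reachable on localhost:11434"
fi
```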
## Groq
```shell
export GROQ_API_KEY=gsk_...
tachyonic scan --provider groq --model llama-3.1-70b-versatile
```

## Together AI
```shell
export TOGETHER_API_KEY=...
tachyonic scan --provider together-ai --model meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo
```

## DeepSeek
```shell
export DEEPSEEK_API_KEY=...
tachyonic scan --provider deep-seek --model deepseek-chat
```

## Mistral
```shell
export MISTRAL_API_KEY=...
tachyonic scan --provider mistral --model mistral-large-latest
```

## Custom Endpoints
For OpenAI-compatible APIs not listed above:
```shell
tachyonic scan \
  --target https://your-api.com/v1/chat/completions \
  --api-key your-key-here \
  --model your-model-name
```

Tachyonic auto-detects the OpenAI-compatible request/response format when no --provider flag is set.
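The auto-detected format is the standard /v1/chat/completions contract, so a request like the following (reusing the placeholder endpoint, key, and model from the example above) should succeed against any compatible API:

```shell
# Minimal OpenAI-compatible chat request, using the same placeholder
# endpoint, key, and model as the scan example above.
payload='{"model": "your-model-name", "messages": [{"role": "user", "content": "ping"}]}'
curl -s https://your-api.com/v1/chat/completions \
  -H "Authorization: Bearer your-key-here" \
  -H "Content-Type: application/json" \
  -d "$payload"
```

If this returns a JSON body with a `choices` array, the endpoint speaks the format Tachyonic expects.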