Tachyonic

Provider Setup

Configure API keys and endpoints for each supported LLM provider

Supported Providers

| Provider    | Flag                     | Env Variable        | Default Endpoint                       |
|-------------|--------------------------|---------------------|----------------------------------------|
| Anthropic   | --provider anthropic     | ANTHROPIC_API_KEY   | api.anthropic.com/v1/messages          |
| OpenAI      | --provider open-ai       | OPENAI_API_KEY      | api.openai.com/v1/chat/completions     |
| Google AI   | --provider gemini        | GOOGLE_API_KEY      | Gemini API (key passed in URL)         |
| Mistral     | --provider mistral       | MISTRAL_API_KEY     | api.mistral.ai/v1/chat/completions     |
| DeepSeek    | --provider deep-seek     | DEEPSEEK_API_KEY    | api.deepseek.com/chat/completions      |
| Groq        | --provider groq          | GROQ_API_KEY        | api.groq.com/openai/v1/chat/completions |
| Together AI | --provider together-ai   | TOGETHER_API_KEY    | api.together.xyz/v1/chat/completions   |
| Ollama      | --provider ollama        | (none)              | localhost:11434                        |

Anthropic

export ANTHROPIC_API_KEY=sk-ant-api03-...

tachyonic scan --provider anthropic --model claude-haiku-4-5-20251001

No --target flag is needed; the scan uses Anthropic's default endpoint.
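A missing or empty key otherwise only surfaces as an HTTP error mid-scan, so a small shell guard can fail fast. This is a generic sketch, not a Tachyonic feature; the variable name comes from the table above:

```shell
# Fail fast when a provider key is missing instead of hitting an
# auth error mid-scan. Pass the env variable name from the table.
require_key() {
  if [ -z "$(printenv "$1")" ]; then
    echo "$1 is not set" >&2
    return 1
  fi
  echo "$1 is set"
}

# Example:
# require_key ANTHROPIC_API_KEY && tachyonic scan --provider anthropic
```

Note that printenv only sees exported variables, which matches how the keys are set in the examples here.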

Session Auth (OAuth)

For Anthropic OAuth tokens (e.g., from Claude Code):

tachyonic login --provider anthropic
# Follow prompts to paste OAuth token

tachyonic scan --provider anthropic --auth-mode session

OpenAI

export OPENAI_API_KEY=sk-...

tachyonic scan --provider open-ai --model gpt-4o

Session Auth (Codex)

Tachyonic can reuse credentials from the Codex CLI. If you have Codex installed and logged in:

tachyonic login --provider open-ai
# Reuses existing Codex credentials from ~/.codex/auth.json

tachyonic scan --provider open-ai --auth-mode session
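If you're unsure whether Codex credentials are present, checking the path mentioned above avoids a failed login round-trip. This is a hypothetical helper, not a Tachyonic command:

```shell
# Hypothetical helper: report whether a Codex credentials file
# exists at the given path (default location is ~/.codex/auth.json).
check_codex_creds() {
  if [ -f "$1" ]; then
    echo "codex credentials found"
  else
    echo "no codex credentials; use --device-auth"
  fi
}

check_codex_creds "$HOME/.codex/auth.json"
```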

If Codex isn't installed, use --device-auth for device flow guidance:

tachyonic login --provider open-ai --device-auth

Google AI (Gemini)

export GOOGLE_API_KEY=AI...

tachyonic scan --provider gemini --model gemini-2.5-flash

Ollama (Local)

Run models locally with no API key:

# Start Ollama
ollama serve

# Pull a model
ollama pull llama3.1

# Scan
tachyonic scan --provider ollama --model llama3.1

Ollama serves an OpenAI-compatible API at localhost:11434, which Tachyonic talks to directly.
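The exact URL can be derived from Ollama's OLLAMA_HOST convention. A sketch, assuming the default host and port, and that Ollama's OpenAI-compatible routes live under /v1:

```shell
# Build the OpenAI-compatible chat completions URL for a local
# Ollama instance (default host:port is localhost:11434).
ollama_chat_url() {
  echo "http://${OLLAMA_HOST:-localhost:11434}/v1/chat/completions"
}

ollama_chat_url
```

For an Ollama instance on another host, the same URL could be passed to Tachyonic via --target instead of relying on the default.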

Groq

export GROQ_API_KEY=gsk_...

tachyonic scan --provider groq --model llama-3.1-70b-versatile

Together AI

export TOGETHER_API_KEY=...

tachyonic scan --provider together-ai --model meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo

DeepSeek

export DEEPSEEK_API_KEY=...

tachyonic scan --provider deep-seek --model deepseek-chat

Mistral

export MISTRAL_API_KEY=...

tachyonic scan --provider mistral --model mistral-large-latest

Custom Endpoints

For OpenAI-compatible APIs not listed above:

tachyonic scan \
  --target https://your-api.com/v1/chat/completions \
  --api-key your-key-here \
  --model your-model-name

Tachyonic auto-detects the OpenAI-compatible request/response format when no --provider is set.
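For auto-detection to succeed, the endpoint must accept the standard chat-completions request body. A minimal sketch of that shape (field names follow the OpenAI chat API; the model name and URL are placeholders):

```shell
# Minimal OpenAI-compatible chat request body; any endpoint used
# with --target should accept this shape.
cat <<'EOF' > req.json
{
  "model": "your-model-name",
  "messages": [
    {"role": "user", "content": "ping"}
  ]
}
EOF

# Sending it manually (requires the endpoint to be live):
# curl -s https://your-api.com/v1/chat/completions \
#   -H "Authorization: Bearer your-key-here" \
#   -H "Content-Type: application/json" \
#   -d @req.json
```

If a manual request like this succeeds against your endpoint, a --target scan should work with the same URL and key.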
