

qai uses LLM providers for injection campaigns (qai inject campaign) and chain execution (qai chain run). Other modules (audit, proxy, ipi, cxp, rxp) do not require LLM providers. Provider support is powered by litellm, giving access to 100+ providers and models.

Provider/Model Format

All model references in qai use the provider/model format:
anthropic/claude-sonnet-4-20250514
openai/gpt-4o
groq/llama-3.3-70b-versatile
ollama/llama3
Bare model names without a provider prefix are treated as Anthropic (e.g., claude-sonnet-4-20250514 becomes anthropic/claude-sonnet-4-20250514).
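The normalization rule above can be sketched as follows (an illustrative sketch only; the function name is hypothetical, not part of qai):

```python
def normalize_model(model: str, default_provider: str = "anthropic") -> str:
    """Prefix bare model names with the default provider."""
    # Already in provider/model form -> return unchanged.
    if "/" in model:
        return model
    return f"{default_provider}/{model}"

print(normalize_model("claude-sonnet-4-20250514"))
# anthropic/claude-sonnet-4-20250514
print(normalize_model("openai/gpt-4o"))
# openai/gpt-4o
```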

Storing Credentials (CLI)

Store API keys in the OS keyring using qai config set-credential. The command prompts for the key with masked input — the key is never passed as a command-line argument.
qai config set-credential anthropic
# Prompts: "API key for anthropic:" (masked input)
# Credential for anthropic saved to OS keyring.
Credentials are stored in your OS native secret store:
  • Windows: Credential Manager
  • macOS: Keychain
  • Linux: Secret Service (requires a keyring daemon)
On headless Linux without a keyring daemon, the keyring backend falls back to insecure storage. In that environment, use environment variables instead. See Environment Variables.

View configured providers

qai config list-providers
Displays a table listing each known provider and its credential status: stored in the keyring, set in the environment, or not set.

Delete credentials

qai config delete-credential anthropic

Migrate legacy plaintext credentials

If you have API keys in ~/.qai/config.yaml from an older version:
qai config import-legacy-credentials
This moves keys from the YAML file to the OS keyring and removes them from the config file. The original file is backed up as config.yaml.bak.
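The migration flow can be sketched like this, with the OS keyring and the loaded YAML config both simulated as plain dicts (the real command reads config.yaml and writes to the platform secret store; the function name and config layout here are assumptions for illustration):

```python
import copy

def import_legacy_credentials(config: dict, keyring: dict) -> tuple[dict, dict]:
    """Move api_keys entries from a loaded config into the keyring.

    Returns the cleaned config and a backup of the original
    (standing in for config.yaml.bak).
    """
    backup = copy.deepcopy(config)
    # Remove the api_keys section from the config and store each key.
    for provider, key in config.pop("api_keys", {}).items():
        keyring[provider] = key
    return config, backup

config = {"api_keys": {"anthropic": "sk-ant-..."}, "default_model": "openai/gpt-4o"}
keyring: dict = {}
config, backup = import_legacy_credentials(config, keyring)
# keyring now holds the key; config no longer contains plaintext credentials.
```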

Storing Credentials (Environment Variables)

Set provider API keys as environment variables. These take priority over the keyring.
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GROQ_API_KEY="gsk_..."
The naming convention is {PROVIDER_NAME}_API_KEY (uppercase). See Environment Variables for the full list.
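The naming convention amounts to a simple mapping, sketched here for illustration (the helper name is hypothetical):

```python
def env_var_name(provider: str) -> str:
    """Map a provider name to its {PROVIDER_NAME}_API_KEY variable."""
    return f"{provider.upper()}_API_KEY"

print(env_var_name("anthropic"))  # ANTHROPIC_API_KEY
print(env_var_name("groq"))       # GROQ_API_KEY
```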

Credential Resolution

When a command needs a provider credential, qai checks in this order:
  1. Environment variable (e.g., ANTHROPIC_API_KEY)
  2. OS keyring (stored via qai config set-credential)
  3. Error if neither is found
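A minimal sketch of this resolution order, with the OS keyring simulated as a dict (qai's actual internals may differ; the function name and error type are assumptions):

```python
import os

def resolve_credential(provider: str, keyring: dict) -> str:
    """Resolve a provider credential: env var first, then keyring, else error."""
    env_key = os.environ.get(f"{provider.upper()}_API_KEY")
    if env_key:
        return env_key
    if provider in keyring:
        return keyring[provider]
    raise LookupError(f"No credential found for provider '{provider}'")

os.environ["OPENAI_API_KEY"] = "sk-..."   # env var takes priority
keyring = {"openai": "keyring-value", "groq": "gsk_..."}
assert resolve_credential("openai", keyring) == "sk-..."   # env var wins
assert resolve_credential("groq", keyring) == "gsk_..."    # falls back to keyring
```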

Settings UI

The web UI Settings page (gear icon in the nav bar) provides a graphical interface for managing providers:
  • Providers section: Add, edit, test, and delete provider credentials. Each provider shows its type (cloud or local) and credential status.
  • Defaults section: Set default provider, model, transport, and callback URL. These pre-populate launcher forms.
To open Settings, run qai and click the gear icon, or navigate to /settings in the browser.

Local Providers

Local providers (Ollama, LM Studio) run on your machine and don’t require API keys. Configure them through the Settings UI or by setting the model string directly:
# Ollama (must be running: ollama serve)
qai inject campaign --model ollama/llama3 --rounds 1

# LM Studio (must have local server running)
qai inject campaign --model lmstudio/your-model-name --rounds 1

Using Providers in Commands

# Inject campaign — requires a provider
qai inject campaign \
  --model anthropic/claude-sonnet-4-20250514 \
  --rounds 2

# Chain execution — requires a provider
qai chain run \
  --chain-file my_chain.yaml \
  --inject-model openai/gpt-4o \
  --targets ~/.qai/chain-targets.yaml

# Audit — does NOT require a provider
qai audit scan --transport stdio --command "npx @modelcontextprotocol/server-memory"