q-ai uses a multi-provider architecture powered by litellm. Credentials are managed via OS keyring (recommended) or environment variables. Most modules require no external configuration — audit and proxy connect directly to MCP servers, IPI generates payloads locally, CXP builds context files locally, and RXP uses local embedding models.

Provider credentials

API keys are needed only for inject campaigns and chain execution, which call LLM provider APIs. q-ai supports any litellm-compatible provider. Store credentials securely in your operating system’s native secret store (Windows Credential Manager, macOS Keychain, Linux Secret Service):
qai config set-credential anthropic
qai config set-credential openai
qai config set-credential groq
Each command prompts for the API key with masked input. Verify configured providers:
qai config list-providers
Remove a credential:
qai config delete-credential anthropic
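The store/list/delete workflow above can be sketched in Python. This is an illustrative stand-in, not q-ai's actual implementation: the dict-backed store substitutes for the OS keyring backend, and the function names are hypothetical.

```python
from getpass import getpass

# Hypothetical in-memory stand-in for the OS keyring backend
# (the real tool talks to Credential Manager / Keychain / Secret Service).
_store: dict[str, str] = {}

def set_credential(provider: str, prompt=getpass) -> None:
    """Prompt for an API key with masked input and store it under the provider name."""
    key = prompt(f"API key for {provider}: ").strip()
    if not key:
        raise ValueError("empty API key")
    _store[provider] = key

def list_providers() -> list[str]:
    """Return the providers that currently have a stored credential."""
    return sorted(_store)

def delete_credential(provider: str) -> None:
    """Remove a stored credential; deleting a missing provider is a no-op."""
    _store.pop(provider, None)
```

The injectable `prompt` parameter keeps the masked-input behavior (`getpass`) while allowing non-interactive use, e.g. `set_credential("anthropic", prompt=lambda _: "sk-test")`.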

Environment variables (alternative)

Set the provider’s standard environment variable. q-ai resolves credentials in this order: environment variable → OS keyring → error (no plaintext config file fallback).
| Variable | Provider | Required for |
|---|---|---|
| ANTHROPIC_API_KEY | Anthropic | inject, chain with Anthropic models |
| OPENAI_API_KEY | OpenAI | inject, chain with OpenAI models |
| GROQ_API_KEY | Groq | inject, chain with Groq models |
| MISTRAL_API_KEY | Mistral | inject, chain with Mistral models |
| COHERE_API_KEY | Cohere | inject, chain with Cohere models |
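The resolution order (environment variable, then OS keyring, then error) can be sketched as follows. The function name and the `keyring_lookup` callable are illustrative assumptions about the internals, not q-ai's API.

```python
import os

def resolve_credential(provider: str, env_var: str, keyring_lookup) -> str:
    """Resolve an API key: environment variable first, then OS keyring, else error.

    `keyring_lookup` stands in for the OS keyring query. There is deliberately
    no plaintext-config-file fallback.
    """
    value = os.environ.get(env_var)
    if value:
        return value
    value = keyring_lookup(provider)
    if value:
        return value
    raise LookupError(f"no credential found for {provider}")
```

Because the environment variable wins, an exported `ANTHROPIC_API_KEY` overrides a keyring entry for the same provider.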
For local models via Ollama, no API key is needed — just ensure your Ollama server is running.

Model strings

q-ai uses the provider/model format for all model references:
qai inject campaign --model anthropic/claude-sonnet-4-20250514 ...
qai inject campaign --model openai/gpt-4o ...
qai inject campaign --model groq/llama-3.3-70b-versatile ...
qai inject campaign --model ollama/llama3 ...
Bare model strings without a provider prefix fall back to anthropic/ for backward compatibility.
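The prefix handling described above can be expressed as a small parser. The function name is illustrative, not part of q-ai; the behavior (split on the first `/`, bare names default to `anthropic`) follows the documented rules.

```python
def normalize_model(model: str, default_provider: str = "anthropic") -> tuple[str, str]:
    """Split a provider/model string into (provider, model name).

    Bare model strings with no '/' fall back to the default provider,
    matching the documented backward-compatibility behavior.
    """
    provider, sep, name = model.partition("/")
    if not sep:  # no prefix: backward-compatible anthropic/ fallback
        return default_provider, model
    return provider, name
```

For example, `normalize_model("claude-sonnet-4-20250514")` yields `("anthropic", "claude-sonnet-4-20250514")`.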

Legacy credential migration

If you previously stored API keys in ~/.qai/config.yaml (plaintext), migrate them to the OS keyring:
qai config import-legacy-credentials
This reads keys from the YAML file, writes them to the keyring, backs up the original file, and removes the keys from YAML. Non-secret settings are preserved. Safe to run multiple times.
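The migration flow (read secrets, write them to the keyring, back up the file, strip the secrets, keep everything else, stay safe to re-run) can be sketched with the standard library only. This is a simplified model: the real config.yaml may nest, the secret key names and the `keyring_set` interface are assumptions, and a real implementation would use a YAML parser.

```python
import shutil
from pathlib import Path

# Assumed secret key names for illustration; the real set may differ.
SECRET_KEYS = {"anthropic_api_key", "openai_api_key", "groq_api_key"}

def import_legacy_credentials(config: Path, keyring_set) -> int:
    """Move secret `key: value` lines from a flat config file into the keyring.

    Backs up the original, rewrites the file without the secrets, preserves
    non-secret settings, and returns the number of keys moved. A second run
    finds no secrets and changes nothing, so it is safe to repeat.
    """
    kept, moved = [], 0
    for line in config.read_text().splitlines():
        key, _, value = line.partition(":")
        if key.strip() in SECRET_KEYS and value.strip():
            keyring_set(key.strip(), value.strip())
            moved += 1
        else:
            kept.append(line)
    if moved:
        shutil.copy2(config, config.with_suffix(".yaml.bak"))
        config.write_text("\n".join(kept) + "\n")
    return moved
```

Backing up before rewriting means a failed migration never loses the original file, and counting moved keys makes the no-op re-run observable.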

Data directory

All q-ai data is stored in ~/.qai/:
| Path | Purpose |
|---|---|
| ~/.qai/qai.db | Unified SQLite database (runs, findings, targets, evidence, module-specific tables) |
| ~/.qai/config.yaml | Non-secret configuration settings |
| ~/.qai/artifacts/ | Generated reports and exports |

Modules requiring no configuration

| Module | Why no config needed |
|---|---|
| audit | Connects directly to MCP servers via stdio, SSE, or Streamable HTTP |
| proxy | Intercepts MCP traffic; no external API calls |
| ipi | Generates payloads locally, runs a self-hosted callback listener |
| cxp | Builds context files locally from templates and rules |
| rxp | Uses local embedding models via sentence-transformers (downloaded from HuggingFace on first use; no API key needed) |
RXP requires the optional [rxp] extra: pip install q-uestionable-ai[rxp]. This installs sentence-transformers and chromadb for local embedding and vector search.