## Provider and Model
Set the provider and model via the CLI. The `assist.model` value is the model name without a provider prefix; the runtime builds the full `provider/model` string from `assist.provider` and `assist.model` automatically. See LLM Provider Configuration for the full list of supported providers.
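A minimal sketch using the `qai config set` commands from the Configuration Summary below (the provider and model values are illustrative):

```shell
# Select the assistant's provider and model separately;
# qai combines them into the full provider/model string.
qai config set assist.provider ollama
qai config set assist.model llama3.1
```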
## Environment Variable Override
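A sketch using the variable names from the Configuration Summary table below (the values are illustrative):

```shell
# Override the assistant's provider and model for this shell session.
export QAI_ASSIST_PROVIDER=ollama
export QAI_ASSIST_MODEL=llama3.1
```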
You can set the provider and model via environment variables instead of the config store.

## Base URL
If your assistant's LLM provider runs on a remote host or non-default port, set a custom base URL (e.g., `http://localhost:11434` for Ollama). This is useful when targeting a remote Ollama instance, LM Studio on another machine, or any custom OpenAI-compatible endpoint.

Note that `assist.base_url` controls only the assistant's LLM endpoint. It is completely independent of the base URLs used for testing targets (e.g., `ollama.base_url`). You can point the assistant at one Ollama instance while running inject campaigns against a different one.
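Using the CLI setting from the Configuration Summary (the remote host and port are illustrative):

```shell
# Point the assistant at an Ollama instance on another machine.
qai config set assist.base_url http://192.168.1.50:11434
```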
## Credentials
Cloud providers require an API key. Store credentials using the same keyring/environment-variable pattern as other qai providers. The `ollama`, `lmstudio`, and `custom` providers skip the credential check entirely. If you're using Ollama, no API key is needed; just ensure Ollama is running (`ollama serve`).
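A sketch following the pattern in the Configuration Summary; the `openai` provider name here is an illustrative example of a cloud provider, not a confirmed provider ID:

```shell
# Store a cloud provider key via qai's credential command...
qai config set-credential openai
# ...or supply it through the {PROVIDER}_API_KEY environment variable.
export OPENAI_API_KEY=...
```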
See LLM Provider Configuration for credential management details.
## Embedding Model
The knowledge base uses a sentence-transformers model to generate embeddings for retrieval. The default is `all-MiniLM-L6-v2`, which runs locally and requires no API key.
To override:
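For example, using the CLI setting from the Configuration Summary (the replacement model name is illustrative):

```shell
qai config set assist.embedding_model all-mpnet-base-v2
```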
The embedding model runs locally via sentence-transformers regardless of which LLM provider you use. If you change the embedding model, run `qai assist reindex` to rebuild the knowledge base with the new embeddings.

## User Knowledge Directory
The assistant indexes user-provided reference files from `~/.qai/knowledge/` by default. To use a different directory:
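A sketch using the setting from the Configuration Summary (the directory path is illustrative):

```shell
qai config set assist.knowledge_dir ~/my-qai-notes
```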
## Configuration Summary
| Setting | CLI | Env Var | Default |
|---|---|---|---|
| Provider | `qai config set assist.provider` | `QAI_ASSIST_PROVIDER` | None (required) |
| Model | `qai config set assist.model` | `QAI_ASSIST_MODEL` | None (required) |
| Base URL | `qai config set assist.base_url` | `QAI_ASSIST_BASE_URL` | None (provider default) |
| Embedding model | `qai config set assist.embedding_model` | `QAI_ASSIST_EMBEDDING_MODEL` | `all-MiniLM-L6-v2` |
| Knowledge directory | `qai config set assist.knowledge_dir` | `QAI_ASSIST_KNOWLEDGE_DIR` | `~/.qai/knowledge/` |
| Credentials | `qai config set-credential <provider>` | `{PROVIDER}_API_KEY` | None (required for cloud) |