The callback server is a FastAPI application that receives HTTP callbacks from AI agents that execute hidden payloads.
## Default Configuration

| Setting | Default | Description |
|---|---|---|
| Host | 127.0.0.1 | Network interface to bind to |
| Port | 8080 | TCP port to listen on |
| Notify URL | http://127.0.0.1:8899 | Main qai web server for hit notifications |

The database directory (~/.qai/) is created automatically on first run.
## CLI Configuration

All server settings are configured via CLI flags on qai ipi listen:

```shell
# Default — localhost only
qai ipi listen

# Bind to all interfaces on port 9090
qai ipi listen --host 0.0.0.0 --port 9090
```

| Option | Type | Default | Description |
|---|---|---|---|
| --host, -h | TEXT | 127.0.0.1 | Network interface to bind to |
| --port, -p | INT | 8080 | TCP port to listen on |
| --notify-url | TEXT | http://127.0.0.1:8899 | Where to send internal hit notifications |
## Callback URL Structure

The listener accepts callbacks at two URL patterns:

| Path | Token Validated | Description |
|---|---|---|
| /c/&lt;uuid&gt;/&lt;token&gt; | Yes | Authenticated callback — token checked against database |
| /c/&lt;uuid&gt; | No | Unauthenticated callback — recorded with token_valid=False |

Both GET and POST methods are accepted, and POST bodies are captured for exfil payload types. All callback endpoints return a fake 404 response to avoid alerting the target system.
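The routing behavior above can be sketched as a small handler. This is an illustrative stand-in, not the server's actual code: the real listener is a FastAPI app, and the function name and in-memory token store here are hypothetical.

```python
import re

# Hypothetical in-memory stand-in for the campaign-token database.
KNOWN_TOKENS = {"a1b2c3d4": "s3cr3t"}

CALLBACK_RE = re.compile(r"^/c/(?P<uuid>[^/]+)(?:/(?P<token>[^/]+))?$")

def handle_callback(path: str) -> tuple[int, dict]:
    """Record a hit for /c/<uuid>[/<token>] and always answer 404."""
    m = CALLBACK_RE.match(path)
    if not m:
        return 404, {}  # not a callback path at all
    uuid, token = m.group("uuid"), m.group("token")
    hit = {
        "uuid": uuid,
        # Token present and matching the stored one -> authenticated hit.
        "token_valid": token is not None and KNOWN_TOKENS.get(uuid) == token,
    }
    # The 404 status is a decoy: hits and genuine misses look identical
    # to the calling agent.
    return 404, hit
```

Note that the unauthenticated pattern still produces a recorded hit; only the token_valid flag differs.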
## Per-Campaign Authentication

Cryptographic tokens are generated automatically when payloads are created; no separate authentication configuration is needed. The token is embedded in the callback URL during payload generation:

```shell
qai ipi generate --callback http://localhost:8080
# Generates URLs like: http://localhost:8080/c/a1b2c3d4-.../e5f6g7h8...
```
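URL generation of this shape can be sketched as follows. The exact token format qai uses is not specified here, so the UUID-plus-URL-safe-secret combination below is an assumption for illustration only:

```python
import secrets
import uuid

def make_callback_url(base: str) -> str:
    """Illustrative: one UUID per payload plus a random path token."""
    campaign_id = uuid.uuid4()
    # Assumption: a URL-safe random token; the real format may differ.
    token = secrets.token_urlsafe(16)
    return f"{base}/c/{campaign_id}/{token}"
```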
## Confidence Scoring

Confidence is assigned automatically based on token validity and User-Agent analysis:

| Level | Criteria |
|---|---|
| HIGH | Valid campaign token present |
| MEDIUM | No/invalid token + programmatic User-Agent |
| LOW | No/invalid token + browser/scanner User-Agent |

Confidence thresholds are not user-configurable. The programmatic User-Agent pattern matches: python-requests, httpx, aiohttp, urllib, curl, wget, node-fetch, axios, got, undici, fetch, llm, openai, langchain.
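The rules in the table reduce to a short decision function. This is a sketch of the scoring logic, assuming simple case-insensitive substring matching against the pattern list (the function name is hypothetical):

```python
PROGRAMMATIC_AGENTS = (
    "python-requests", "httpx", "aiohttp", "urllib", "curl", "wget",
    "node-fetch", "axios", "got", "undici", "fetch", "llm", "openai",
    "langchain",
)

def score_confidence(token_valid: bool, user_agent: str) -> str:
    """Map token validity + User-Agent to a confidence level."""
    if token_valid:
        return "HIGH"  # a valid campaign token is conclusive on its own
    ua = user_agent.lower()
    if any(pattern in ua for pattern in PROGRAMMATIC_AGENTS):
        return "MEDIUM"  # looks like an HTTP library or LLM framework
    return "LOW"  # browser- or scanner-like traffic
```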
## Bridge Token

The IPI listener runs as a separate process from the main qai web server. When a hit arrives, the listener notifies the web server via an internal HTTP POST so the web UI can update in real time via WebSocket. This internal communication is authenticated with a shared bridge token:

- Location: ~/.qai/bridge.token
- Format: 32 hex characters, generated via secrets.token_hex(16)
- File permissions: 0600 on POSIX
- Auto-generated: on first access by either process (race-safe exclusive file creation)

Both processes read from the same file — no manual configuration needed.

The listener includes the token in the Authorization header of its internal POST. If the tokens don't match (stale file), the notification is rejected; the hit is still recorded in the database but won't appear in the web UI until the page is refreshed.

If WebSocket hit notifications stop working, delete ~/.qai/bridge.token and restart both the web server and the listener. A fresh token will be generated automatically.
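Race-safe exclusive file creation of this kind is typically done with O_CREAT | O_EXCL. The sketch below shows the pattern under those assumptions (the function name is illustrative, not qai's actual API):

```python
import os
import secrets

def load_bridge_token(path: str) -> str:
    """Create-or-read a shared token file; O_EXCL makes creation race-safe."""
    try:
        # O_CREAT | O_EXCL fails if the file already exists, so exactly one
        # process wins the race and writes the token. 0o600 = owner-only.
        fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
        with os.fdopen(fd, "w") as f:
            f.write(secrets.token_hex(16))  # 32 hex characters
    except FileExistsError:
        pass  # another process created it first; fall through and read it
    with open(path) as f:
        return f.read().strip()
```

Because both the losing process and every later call simply read the file, the two processes always converge on the same token.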
## Health Check

The server exposes a health endpoint at /health that returns {"status": "ok"}.
## Public Exposure and Tunnel Mode
When the listener runs with --tunnel cloudflare (see Remote callbacks via Cloudflare Tunnel), a few behaviors differ from a bare listener bound to 127.0.0.1 or a plain --host 0.0.0.0.
**Forwarded-header trust.** In tunnel mode the server resolves each hit's source_ip from the CF-Connecting-IP header rather than the TCP peer. This trust is scoped to tunnel mode only — a bare --host 0.0.0.0 listener does not trust forwarded headers.
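The trust rule can be sketched as a small function (names are illustrative). The key point is that the header is consulted only when the process knows it is behind the tunnel, since anyone can set forwarded headers on a directly exposed port:

```python
def resolve_source_ip(headers: dict, peer_ip: str, tunnel_mode: bool) -> str:
    """Trust CF-Connecting-IP only when running behind the tunnel."""
    if tunnel_mode:
        # Behind cloudflared the TCP peer is always the local tunnel client,
        # so the real client address arrives in CF-Connecting-IP.
        return headers.get("CF-Connecting-IP", peer_ip)
    # A bare listener must not trust forwarded headers: any client can forge them.
    return peer_ip
```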
**Hardening in tunnel mode.** Tunneled listeners activate defensive measures against public-internet exposure: request body size limits, per-peer rate limiting, and a startup warning banner. These are defaults for tunnel mode and cannot be silently disabled.
**State file permissions.** The ~/.qai/active-callback state file is created with mode 0o600 on POSIX (owner read/write only). Windows has no direct equivalent; on Windows the file is protected only by the default user profile ACL. This is best-effort and is called out here as an explicit known limitation, not a hardening claim.
## Network Considerations

The callback listener (qai ipi listen) is separate from the qai Web UI. Only the callback listener should be exposed to external networks. The Web UI must bind to 127.0.0.1 (localhost) unless you have configured proper authentication; do not forward or publish the Web UI host/port via ngrok, Cloudflare Tunnel, or public IPs, since doing so exposes an unauthenticated management interface.
For testing against cloud-hosted AI targets, the listener must be reachable from the target system:

- Local testing: http://localhost:8080 works when the agent runs on the same machine
- Network testing: use your machine's LAN IP (e.g., http://192.168.1.100:8080)
- Cloud targets: use a public IP, or a tunneling service like ngrok or Cloudflare Tunnel
When generating payloads, use the externally reachable address as the --callback URL:

```shell
# Start the listener
qai ipi listen --host 0.0.0.0 --port 8080

# In another terminal, expose it
ngrok http 8080

# Use the ngrok URL when generating payloads
qai ipi generate --callback https://abc123.ngrok.io
```