Convert natural-language questions into SQL for the InteLIS database. The service runs the question through an LLM, validates the generated SQL, executes it, and returns the results (with privacy rules enforced).
- Install deps:

  ```sh
  composer install
  ```

- Export the DB schema (required):

  ```sh
  php ./bin/export-schema.php
  ```

  This generates `var/schema.json`. Re-run it whenever the database schema changes.
- Web UI: open `/chat` and ask a question.
- API:

  Ask a question:

  ```
  POST /ask
  Content-Type: application/json

  {
    "q": "How many VL tests in the last 6 months?",
    "provider": "ollama|openai|anthropic",
    "model": "optional-model-id"
  }
  ```

  Response (minimal shape):

  ```json
  {
    "sql": "SELECT …",
    "rows": [ { "col": "val" } ],
    "timing": { "provider": "…", "model_used": "…", "total_ms": 0 }
  }
  ```
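A minimal client sketch in Python (standard library only; the base URL and default provider are assumptions — adjust them for your deployment):

```python
import json
import urllib.request

def build_ask_payload(question, provider="ollama", model=None):
    """Build the /ask request body; "model" is optional."""
    payload = {"q": question, "provider": provider}
    if model:
        payload["model"] = model
    return payload

def ask(question, base_url="http://localhost:8000", provider="ollama", model=None):
    """POST a question to /ask and return the parsed JSON response."""
    req = urllib.request.Request(
        base_url + "/ask",
        data=json.dumps(build_ask_payload(question, provider, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```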
  Clear conversation context (reset the server-side context window):

  ```
  POST /ask
  Content-Type: application/json

  { "clear_context": true }
  ```

  Response:

  ```json
  { "message": "Conversation context cleared", "context_reset": true }
  ```

  Note: when `clear_context` is `true`, it is handled immediately and any `q` value (if present) is ignored for that request.
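That precedence can be illustrated with a toy handler (a hypothetical Python sketch, not the service's actual code):

```python
def handle_ask(body, context):
    """Toy /ask handler: clear_context is processed first, and any q is ignored."""
    if body.get("clear_context") is True:
        context.clear()  # reset the server-side context window
        return {"message": "Conversation context cleared", "context_reset": True}
    context.append(body.get("q"))
    return {"status": "would run the normal question pipeline"}
```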
High-level flow

- User asks a question (UI `/chat` or `POST /ask`).
- Context is built: user query + conversation history + `var/schema.json` + business rules + field guide → prompt to the selected LLM.
- LLM generates SQL (QueryService validates it and enforces privacy rules).
- SQL is executed against MySQL (DatabaseService).
- Charts are generated from the query output.
- Results are returned to the caller (rows, counts, timing, debug info, charts) and the conversation context is updated.
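The context-building step can be sketched as follows (illustrative only; apart from `var/schema.json`, the function name, parameters, and prompt layout are assumptions, not the app's actual code):

```python
import json
from pathlib import Path

def build_prompt(user_query, history, schema_path="var/schema.json",
                 business_rules="", field_guide=""):
    """Assemble the LLM prompt from query, history, schema, rules, and field guide."""
    schema = json.loads(Path(schema_path).read_text())
    return "\n\n".join([
        "Database schema (structure only, no row data):\n" + json.dumps(schema, indent=2),
        "Business rules:\n" + business_rules,
        "Field guide:\n" + field_guide,
        "Conversation so far:\n" + "\n".join(history),
        "Question: " + user_query,
    ])
```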
```mermaid
flowchart TD
    U["User Query in Natural Language (via UI or API)"] --> QS["<strong>QueryService generates Prompt for LLM.</strong><br><br>User Query, Schema, Business Rules, and Field Guides are attached to Prompt"]
    SCH["Database Schema - without any actual data"] --> QS
    BR["Predefined Business Rules"] --> QS
    FG["Predefined Field Guide"] --> QS
    CTX[Conversation context/history] --> QS
    QS --> LLM["LLM generates SQL based on Prompt"]
    LLM --> VAL["Validate SQL & enforce privacy"]
    VAL --> DB[(MySQL executes SQL)]
    DB --> CS["Chart generation based on Query Output"]
    CS --> RESP[Response JSON with Query Output + Charts]
    RESP --> CTX
```
Works with Ollama, OpenAI, and Anthropic. Pick a provider/model in the `/chat` settings, or send `provider`/`model` in the `/ask` payload.
- Privacy rules prevent returning disallowed columns.
- If you see "model not found", use an explicit model id (e.g., for Anthropic use a dated id).
- If SQL generation looks off after schema changes, re-export the schema (`export-schema.php`).
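The privacy check can be sketched as a simple column blocklist (illustrative only; the real rules live in QueryService, and the column names below are hypothetical):

```python
import re

# Hypothetical blocklist; the actual disallowed columns are defined by the app's privacy rules.
DISALLOWED_COLUMNS = {"patient_name", "patient_phone", "national_id"}

def violates_privacy(sql):
    """Return True if the SQL references any disallowed column identifier."""
    tokens = set(re.findall(r"[a-z_][a-z0-9_]*", sql.lower()))
    return bool(tokens & DISALLOWED_COLUMNS)
```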