Modular architecture to detect and prevent private data leakage in user-LLM interactions.
- **multiagent-firewall**: Implements a LangGraph-based multiagent firewall for advanced data leakage detection and policy management.
- **backend**: Provides a FastAPI server as a bridge connecting the proxy and extension packages to the multiagent firewall.
- **extension**: Chromium-based extension that analyzes user-LLM interactions to detect sensitive data and provide feedback to the user within the browser.
- **proxy**: Protects user-LLM interactions from command-line clients, IDEs, or applications by routing their LLM API calls through the multiagent firewall.
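To give a feel for the kind of check the firewall performs, here is a deliberately simplified, regex-based PII scan. This is illustrative only: the real detection runs through the LangGraph pipeline (LLM, OCR, and NER stages configured under `multiagent-firewall/config/`), and the pattern names below are made up for this sketch.

```python
# Toy sketch of rule-based PII detection -- NOT the actual pipeline.
import re

# Hypothetical pattern set; the real detectors are configurable.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def find_pii(text: str) -> dict[str, list[str]]:
    """Map each PII category to the matches found in `text`."""
    hits = {name: pat.findall(text) for name, pat in PII_PATTERNS.items()}
    return {name: matches for name, matches in hits.items() if matches}

print(find_pii("Contact jane@example.com, SSN 123-45-6789."))
# {'email': ['jane@example.com'], 'us_ssn': ['123-45-6789']}
```

A production pipeline layers contextual models on top of such rules, since regexes alone miss paraphrased or embedded sensitive data.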
```mermaid
flowchart TB
subgraph Browser["🌐 Browser Usage"]
USER1[User on Web LLM Chatbot]
EXT[Extension]
LLMCHATBOT[LLM Chatbot Website]
USER1 -->|text / file| EXT
EXT -->|warns about detection results| USER1
EXT -.->|forwards when safe or allowed by the user| LLMCHATBOT
end
subgraph SystemWide["💻 API Usage"]
USER2[User on CLI/IDE/App]
PROXY[Proxy]
LLMAPI[LLM API Providers]
USER2 -->|LLM API calls| PROXY
PROXY -->|403 block or allow| USER2
PROXY -.->|forwards when safe| LLMAPI
end
subgraph Backend["🔌 Backend"]
API[FastAPI Server<br/><small>/detect endpoint</small>]
FIREWALL[Multiagent Firewall<br/><small>LangGraph Pipeline</small>]
API -->|invoke| FIREWALL
FIREWALL -->|detection result| API
end
EXT -->|POST /detect<br/>text or file| API
PROXY -->|POST /detect<br/>text or file| API
API -->|detection result| EXT
API -->|detection result| PROXY
style Browser fill:#e3f2fd,stroke:#1976d2,stroke-width:2px,color:#000
style SystemWide fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px,color:#000
style Backend fill:#fff3e0,stroke:#f57c00,stroke-width:2px,color:#000
linkStyle default stroke:#000,stroke-width:2px
```
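The shared contract in the diagram is a `POST /detect` call from the extension or proxy to the backend. The round trip can be sketched as below, with a stub HTTP server standing in for the real FastAPI backend; the payload shape (`{"text": ...}`) and response field (`"sensitive"`) are assumptions for illustration, not the backend's actual schema.

```python
# Sketch of the extension/proxy -> backend POST /detect round trip,
# against an in-process stub server (not the real FastAPI backend).
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class StubDetectHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # Stand-in verdict: pretend the firewall flags anything mentioning "SSN".
        result = {"sensitive": "SSN" in body.get("text", "")}
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), StubDetectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/detect"
req = Request(url,
              data=json.dumps({"text": "My SSN is 123-45-6789"}).encode(),
              headers={"Content-Type": "application/json"})
with urlopen(req) as resp:
    detection = json.loads(resp.read())
server.shutdown()
print(detection)  # {'sensitive': True}
```

The same shape applies whether the caller is the browser extension or the proxy: send the intercepted text, act on the returned verdict.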
Install uv (a modern Python package manager):

curl -LsSf https://astral.sh/uv/install.sh | sh
Configure each package:

- **backend**: Copy `backend/.env.example` to `backend/.env` (server settings).
- **multiagent-firewall**: Copy `multiagent-firewall/.env.example` to `multiagent-firewall/.env` (LLM, OCR, NER settings). Customize the detection pipeline via `multiagent-firewall/config/pipeline.json` and detection options via `multiagent-firewall/config/detection.json`.
- **proxy**: Copy `proxy/.env.example` to `proxy/.env` and configure to your liking.
- **extension**: Modify `extension/src/config.js`.
The backend package bridges the proxy and extension modules to the multiagent firewall.
cd backend && uv sync && uv run python -m app.main

Note: Alternatively, you can build the backend image using the provided Dockerfile:
docker build -t sensitive-data-detector .
docker run -p 8000:8000 --env-file .env sensitive-data-detector

- Go to chrome://extensions/
- Toggle on "Developer mode"
- Click "Load unpacked" → choose path to `sensitive-data-detector/extension/`
The extension will intercept web chatbot interactions (ChatGPT, Gemini, ...) and provide feedback to the user on any potential sensitive-information leakage, based on the configured options.
Detailed information on how to run the proxy package can be found in `proxy/README.md`.

The proxy acts as a middleman between the user and any endpoint listed in `proxy/.env`.
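The proxy's decision, per the diagram ("403 block or allow"), is: consult the firewall's detection result, return a 403 to the caller if the prompt is sensitive, otherwise forward the call to the upstream LLM API. A minimal sketch of that logic, with stand-in callables for the detector and the upstream call (the function names here are illustrative, not the actual proxy implementation):

```python
# Sketch of the proxy's block-or-forward decision (illustrative names).
from typing import Callable

def route_llm_call(prompt: str,
                   detect: Callable[[str], bool],
                   forward: Callable[[str], dict]) -> tuple[int, dict]:
    """Return (status_code, body) for a proxied LLM API call."""
    if detect(prompt):
        # Sensitive content detected: block before the prompt leaves the host.
        return 403, {"error": "blocked: sensitive data detected"}
    # Safe: forward to the real LLM API provider.
    return 200, forward(prompt)

# Usage with stand-ins for the firewall verdict and the upstream API:
status, body = route_llm_call(
    "My SSN is 123-45-6789",
    detect=lambda text: "SSN" in text,          # stand-in detector
    forward=lambda text: {"completion": "..."}  # stand-in upstream call
)
print(status)  # 403
```

Because the block happens in the proxy, no part of the sensitive prompt ever reaches the provider, which is the point of routing CLI/IDE traffic through it.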
MIT license.