# llm-telemetry

Track AI traffic and analytics for any site. That means both:

- AI crawlers/bots hitting your site (GPTBot, ClaudeBot, PerplexityBot, Bytespider, etc.)
- Human visits and conversions referred by AI tools (ChatGPT, Perplexity, Gemini, Copilot, Claude, DeepSeek, etc.)

The goal: answer "What did AI send me?" and "What did AI crawl?" with hard numbers.
## Packages

```
packages/
  registry/          AI bot + referrer classification data + matching functions
  referral-snippet/  Drop-in <script> tag for tracking AI-referred human traffic
  log-parser/        CLI to parse nginx/CloudFront logs and classify bot traffic
  server-collector/  Express server: /ingest endpoint + /r/:code shortlink redirects
  collector-config/  Docker Compose stack: Grafana + Prometheus + Tempo + OTel Collector
```
## Quick start

```sh
npm install
npm run build
npm test
npm run dev:server
# Listening on http://localhost:3456
```

## registry

Maintained registry of 30+ AI bots and 19 AI referrer sources. Exposes classification functions:
```ts
import { classifyBot, classifyReferrer } from "@llm-telemetry/registry";

classifyBot("GPTBot/1.0");
// { isBot: true, name: "gptbot", operator: "OpenAI", purpose: "Training data collection..." }

classifyReferrer("https://chatgpt.com/");
// { isAIReferrer: true, name: "chatgpt", operator: "OpenAI" }
```

## referral-snippet

Drop-in `<script>` tag that detects AI referral traffic and beacons events to your endpoint:
```html
<script src="https://cdn.example.com/snippet.js"
        data-endpoint="https://yoursite.com/api/ingest"
        data-site-id="my-site">
</script>
```

Emits `ai_pageview` on load. Exposes `__llmTelemetry.trackConversion(name, value)` for conversions.
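At its core, the snippet's referral check is a hostname lookup over the registry's referrer list. A minimal sketch of that check (the hostname table below is illustrative; the real snippet uses the shared registry data):

```typescript
// Illustrative referrer check; the actual snippet consults the shared
// registry rather than this hard-coded table.
const AI_REFERRER_HOSTS: Record<string, string> = {
  "chatgpt.com": "chatgpt",
  "www.perplexity.ai": "perplexity",
  "gemini.google.com": "gemini",
};

function detectAIReferrer(referrer: string): string | null {
  if (!referrer) return null; // direct visit or stripped referrer
  try {
    const host = new URL(referrer).hostname;
    return AI_REFERRER_HOSTS[host] ?? null;
  } catch {
    return null; // malformed referrer string
  }
}
```

In the browser this would run against `document.referrer`, with matches beaconed to the configured `data-endpoint` (e.g. via `navigator.sendBeacon`).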
## log-parser

CLI tool to parse server logs and produce AI bot traffic aggregates:

```sh
npx llm-log-parser parse access.log --format nginx --output csv
npx llm-log-parser parse cloudfront.log --format cloudfront --output json --bots-only
```
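Per log line, the parser's core job is pulling out the user agent and classifying it against the registry. A rough sketch of that extraction for nginx's combined format (field handling only; the actual parser supports more formats and edge cases):

```typescript
// nginx "combined" log format ends with: "$http_referer" "$http_user_agent".
// Grab the last double-quoted field as the user agent.
function userAgentFromNginxLine(line: string): string | null {
  const quoted = line.match(/"([^"]*)"/g);
  if (!quoted || quoted.length < 3) return null; // expect request, referer, UA
  return quoted[quoted.length - 1].slice(1, -1); // strip surrounding quotes
}
```

Each extracted user agent can then be passed through `classifyBot` and rolled up into per-bot aggregates.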
## server-collector

Express server with:

- `POST /ingest` -- receive beacon events from the snippet
- `GET /events` -- query stored events
- `GET /r/:code` -- redirect shortlinks with first-party cookie + UTMs
- `POST /shortlinks` -- create shortlinks
- `GET /health` -- health check
Supports memory (default) and SQLite storage backends.
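The beacon payload shape isn't documented above beyond `ai_pageview` and `trackConversion(name, value)`; a hypothetical validation step for an ingest handler might look like this (field names beyond the event types are assumptions):

```typescript
// Hypothetical beacon payload; only the ai_pageview event and the
// trackConversion(name, value) hook are documented, the rest is assumed.
interface TelemetryEvent {
  type: "ai_pageview" | "ai_conversion";
  siteId: string;
  referrer?: string;
  name?: string;   // conversion name
  value?: number;  // conversion value
}

function isValidEvent(body: unknown): body is TelemetryEvent {
  if (typeof body !== "object" || body === null) return false;
  const e = body as Record<string, unknown>;
  return (e.type === "ai_pageview" || e.type === "ai_conversion")
      && typeof e.siteId === "string";
}
```

A handler would reject anything failing this check with a 400 before writing to the memory or SQLite backend.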
## collector-config

Docker Compose stack for observability:

```sh
cd packages/collector-config
docker compose up
```

- Grafana: http://localhost:3000 (admin/admin)
- Prometheus: http://localhost:9090
- Tempo: http://localhost:3200
## Attribution notes

Some AI tools strip referrers. For high-confidence attribution, we support:

- UTM conventions
- Optional redirect/shortlink endpoint (`/r/:code`)
- Server-side log correlation
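For illustration, a shortlink redirect can stamp the target URL with UTM parameters before issuing the redirect, so attribution survives even when the referrer is stripped. A sketch (the `utm_medium`/`utm_campaign` choices here are assumptions, not the server's actual convention):

```typescript
// Sketch: resolve a shortlink target and tag it with UTM parameters so
// the landing page sees the AI source even without a referrer header.
function buildRedirectUrl(target: string, source: string, code: string): string {
  const url = new URL(target);
  url.searchParams.set("utm_source", source);   // e.g. "chatgpt"
  url.searchParams.set("utm_medium", "ai_referral");
  url.searchParams.set("utm_campaign", code);   // the shortlink code
  return url.toString();
}
```

The server would respond with a 302 to this URL while also setting its first-party cookie.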
## Contributing

To add a new bot or referrer:

- Add the entry to `packages/registry/ai-bots.json` or `packages/registry/ai-referrers.json`
- Add a test case in `packages/registry/__tests__/registry.test.ts`
- Run `npm test` to verify
- Submit a PR
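The JSON schema isn't reproduced here; judging from what `classifyBot` returns, a bot entry presumably carries at least a name, operator, purpose, and a user-agent pattern. A hypothetical entry (check the existing entries in `ai-bots.json` for the actual field names):

```json
{
  "name": "examplebot",
  "operator": "Example AI, Inc.",
  "purpose": "Training data collection",
  "userAgentPattern": "ExampleBot"
}
```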
## License

MIT