Turn raw git-ai attribution data into beautiful dashboards, half-life analytics, and organization-wide AI usage insights, all with a single command.
Analyze • Track • Scale - Understand how AI tools impact your development workflow and codebase evolution.
- **AI vs Human Code Analysis** - Quantify AI adoption with detailed breakdowns of code authorship
- **AI Line Half-Life Analytics** - Measure how long AI-written code survives in your codebase
- **Human-AI Collaboration Insights** - See which developers use which AI tools and models
- **Historical Trends** - Track AI adoption over time with automated snapshot generation
- **Organization Dashboards** - Aggregate metrics across multiple repositories
- **Modern UI** - Beautiful, responsive dashboards with light/dark themes
- **GitHub Integration** - Automated reporting with GitHub Actions workflows
- Quick Start
- Installation
- Usage Examples
- Metrics Explained
- CLI Reference
- GitHub Actions Integration
- Customization
- Troubleshooting
- Contributing
- License
```bash
npx git-ai-metrics repo
```

That's it! Run this in any git repository tracked by git-ai to generate a comprehensive AI metrics dashboard.
- **Interactive HTML Dashboard** at `./ai-metrics/index.html`
- **JSON Data Exports** for further analysis
- **Historical Tracking** across multiple runs
- **Beautiful Visualizations** of AI adoption patterns
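The JSON exports make the results easy to script against. A minimal `jq` sketch is below; note that the field names in the fixture are illustrative assumptions, not the tool's documented schema, so inspect your own `summary.json` for the actual keys:

```bash
# Hypothetical summary.json fixture -- real key names may differ
cat > /tmp/summary.json <<'EOF'
{ "date": "2025-05-28", "aiLines": 1200, "humanLines": 4800 }
EOF

# Print a one-line digest of the snapshot
jq -r '"\(.date): \(.aiLines + .humanLines) lines total, \(100 * .aiLines / (.aiLines + .humanLines))% AI"' /tmp/summary.json
# -> 2025-05-28: 6000 lines total, 20% AI
```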
```bash
# Install globally
npm install -g @kibibit/git-ai-metrics

# Or run on demand without installing
npx @kibibit/git-ai-metrics repo
```

```bash
# Or build from source
git clone https://github.com/kibibit/git-ai-metrics.git
cd git-ai-metrics
npm install
npm run build
npm link  # Optional: link globally for development
```

Requirements:

- **Node.js**: ≥ 18.0.0
- **git-ai**: Installation guide
- **Git**: Any recent version (for repository analysis)
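To script the Node.js requirement check, a small POSIX-shell helper can compare the major version from a `node --version` string (the function name here is our own, not part of the tool):

```bash
# Succeeds when a `node --version` string satisfies the >= 18 requirement
meets_node_requirement() {
  major=${1#v}        # strip the leading "v"
  major=${major%%.*}  # keep only the major component
  [ "$major" -ge 18 ]
}

meets_node_requirement "v20.11.1" && echo "Node.js OK"
meets_node_requirement "v16.20.0" || echo "Node.js too old"
# In practice: meets_node_requirement "$(node --version)"
```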
```bash
# Navigate to any git repository with git-ai tracking
cd your-project

# Generate AI metrics dashboard
git-ai-metrics repo

# View the results
open ai-metrics/index.html
```

```bash
# After running daily/weekly, generate timeline view
git-ai-metrics repo-history --input ./ai-metrics

# Force full recalculation (slower but comprehensive)
git-ai-metrics repo --no-incremental --verbose
```

```bash
# Aggregate metrics from multiple repositories
git-ai-metrics org --input ./repos --name "My Company"

# Custom output location
git-ai-metrics org --input ./team-repos --output ./company-metrics --name "Engineering"
```

```bash
# Generate report for specific date
git-ai-metrics repo --date 2025-01-15

# Custom output directory
git-ai-metrics repo --output ./reports/ai-insights
```

Generate AI metrics for a single git repository.
```bash
git-ai-metrics repo [options]
```

| Option | Description | Default |
|---|---|---|
| `-o, --output <path>` | Output directory for reports | `"./ai-metrics"` |
| `-d, --date <date>` | Report date (YYYY-MM-DD format) | Today |
| `--no-incremental` | Force full recalculation | `false` |
| `-v, --verbose` | Show detailed progress | `false` |

```bash
# Basic usage
git-ai-metrics repo

# Full recalculation with verbose output
git-ai-metrics repo --no-incremental --verbose

# Custom output directory and date
git-ai-metrics repo --output ./reports/ai-metrics --date 2025-01-15
```

Generate historical trend analysis from existing metrics snapshots.
```bash
git-ai-metrics repo-history [options]
```

| Option | Description | Default |
|---|---|---|
| `-i, --input <path>` | Input ai-metrics directory | `"./ai-metrics"` |
| `-o, --output <path>` | Output directory | Same as input |
| `-v, --verbose` | Show detailed progress | `false` |

```bash
git-ai-metrics repo-history --input ./ai-metrics --verbose
```

Aggregate AI metrics across multiple repositories for organization-wide insights.
```bash
git-ai-metrics org [options]
```

| Option | Description | Default |
|---|---|---|
| `-i, --input <path>` | Parent folder with repos | `"."` |
| `-o, --output <path>` | Output directory | `"./org-metrics"` |
| `-n, --name <name>` | Organization name | `"Organization"` |
| `-v, --verbose` | Show detailed progress | `false` |

Expected input layout:

```
workspace/
├── repo-frontend/
│   └── ai-metrics/
│       ├── 2025-01-01/
│       │   ├── data.json
│       │   └── report.html
│       └── 2025-01-08/
│           └── ...
├── repo-backend/
│   └── ai-metrics/
│       └── ...
└── repo-mobile/
    └── ai-metrics/
        └── ...
```

```bash
git-ai-metrics org --input ./team-repos --name "Engineering Team" --verbose
```

After running `git-ai-metrics repo`, you'll get:
```
ai-metrics/
├── 2025-05-28/
│   ├── data.json      # Full metrics data
│   ├── summary.json   # Quick stats for aggregation
│   └── report.html    # Interactive dashboard
├── 2025-06-01/
│   └── ...
├── history.html       # Timeline view across all dates
├── history.json       # Historical data
└── index.html         # Redirects to latest report
```
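Because snapshot folders are named by date (`YYYY-MM-DD`), they sort chronologically as plain strings, which makes "find the latest report" a one-liner. A sketch against a throwaway fixture (point it at `./ai-metrics` in practice):

```bash
# Date-named folders sort lexicographically == chronologically
dir=$(mktemp -d)
mkdir -p "$dir/2025-05-28" "$dir/2025-06-01"

latest=$(ls -d "$dir"/*/ | sort | tail -n 1)
basename "$latest"   # -> 2025-06-01
```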
Understand the balance between AI-generated and human-written code:
- AI Lines: Code lines directly attributed to AI tools
- Human Lines: Code written or significantly modified by humans
- Mixed Lines: AI-generated code that was edited by humans before commit
- AI Acceptance Rate: Percentage of AI-generated code committed without changes
> **Tip**: High acceptance rates may indicate effective AI tool usage, while low rates suggest significant human refinement.
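As a concrete (made-up) example of the acceptance-rate arithmetic: if 850 of 1,000 AI-generated lines were committed unchanged, the rate works out as follows:

```bash
# Illustrative numbers, not real tool output
ai_unchanged=850
ai_total=1000
awk -v u="$ai_unchanged" -v t="$ai_total" \
  'BEGIN { printf "AI acceptance rate: %.1f%%\n", 100 * u / t }'
# -> AI acceptance rate: 85.0%
```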
Half-life measures how long AI-written code survives in your codebase before being modified or deleted. This is crucial for understanding code quality and maintainability.
| Metric | Meaning |
|---|---|
| Median | 50% of AI lines survive at least this many days |
| Mean | Average survival time across all AI lines |
| P25/P75 | 25th and 75th percentile survival times |
Interpretation:
- Long half-life (60+ days): AI code is stable and requires minimal maintenance
- Short half-life (< 30 days): AI code may need significant refactoring or have quality issues
- "0 days" or "N/A": Usually indicates recent commits or insufficient tracking history
Quality scores (0-100) help compare different AI tools and prompting strategies:
Scoring Formula:
- 60% - Half-life performance (normalized against best-performing tool)
- 40% - AI acceptance rate
What makes a high score:
- Long half-life (stable, maintainable code)
- High acceptance rate (effective prompting)
- Consistent performance across different code types
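Putting the 60/40 weighting into numbers: with a 45-day half-life, a best-performing tool at 60 days, and an 82% acceptance rate, the score comes out around 78 (all values here are illustrative, not real output):

```bash
half_life=45        # days, this tool
best_half_life=60   # days, best-performing tool (normalization baseline)
acceptance=0.82     # AI acceptance rate

awk -v h="$half_life" -v b="$best_half_life" -v a="$acceptance" \
  'BEGIN { printf "quality score: %.0f\n", (0.6 * h / b + 0.4 * a) * 100 }'
# -> quality score: 78
```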
Track which developers use which AI tools:
- Tool Adoption: Which AI assistants are most popular
- Developer Preferences: Individual usage patterns
- Team Distribution: How AI usage spreads across your team
- Prompt Effectiveness: Which prompting strategies work best
Automate AI metrics generation and publishing with GitHub Actions. All workflows are available in `.github/workflows/`.
- Enable GitHub Pages in your repository settings (source: "GitHub Actions")
- Copy the workflow file to your repository:

  ```bash
  curl -fsSL https://raw.githubusercontent.com/kibibit/git-ai-metrics/main/.github/workflows/repo-metrics.yaml \
    -o .github/workflows/ai-metrics.yaml
  ```
- Commit and push - the workflow will run automatically
The included workflow (`repo-metrics.yaml`) provides:
- Automatic execution on pushes to main and weekly schedule
- GitHub Pages deployment for public dashboards
- Incremental processing to optimize performance
- Historical tracking across multiple runs
- Workflow dispatch for manual triggers
Key Features:
- Runs on `push` to main branch
- Weekly scheduled execution (Sundays at midnight)
- Manual trigger via GitHub UI
- Automatic deployment to GitHub Pages
- Preserves existing metrics for incremental updates
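The trigger section of such a workflow would look roughly like this (a sketch matching the behavior listed above; the branch name and cron expression are assumptions to adjust):

```yaml
on:
  push:
    branches: [main]
  schedule:
    - cron: '0 0 * * 0'   # Sundays at midnight (UTC)
  workflow_dispatch:       # manual trigger from the GitHub UI
```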
For aggregating metrics across multiple repositories:
- Create a dedicated "metrics" repository
- Use the org workflow (`org-metrics.yaml`)
- Configure repository list via workflow inputs or environment variables
- Set up GitHub Pages for organization-wide dashboards
Workflow automatically:
- Fetches metrics from each team's repositories
- Aggregates data across all projects
- Generates organization-wide insights
- Publishes comprehensive dashboards
```yaml
env:
  DEFAULT_REPOS: |
    kibibit/frontend
    kibibit/backend
    kibibit/mobile
```

Use workflow dispatch to override the repository list:
```yaml
workflow_dispatch:
  inputs:
    repos:
      description: 'Comma-separated list of repos (owner/repo)'
      required: false
      default: ''
```

Required permissions for automated workflows:
```yaml
permissions:
  contents: write   # Commit metrics to repository
  pages: write      # Deploy to GitHub Pages
  id-token: write   # GitHub Pages deployment
```

Workflows include:
- GitHub Actions summaries with key metrics
- Automatic status reporting in PRs/issues
- Error handling with detailed logs
- Performance metrics for workflow optimization
The HTML dashboards use CSS custom properties for easy theming. Override these variables to match your brand:
```css
:root {
  /* Color scheme */
  --color-ai: #6366f1;       /* AI-generated code color */
  --color-human: #10b981;    /* Human-written code color */
  --color-mixed: #f59e0b;    /* Mixed authorship color */
  --color-accent: #6366f1;   /* Primary accent color */

  /* Background colors */
  --bg-primary: #fafafa;     /* Main background */
  --bg-secondary: #ffffff;   /* Card/component backgrounds */
  --bg-tertiary: #f4f4f5;    /* Subtle backgrounds */

  /* Text colors */
  --text-primary: #18181b;   /* Primary text */
  --text-secondary: #52525b; /* Secondary text */
  --text-muted: #a1a1aa;     /* Muted text */

  /* UI elements */
  --border-color: #e4e4e7;   /* Borders and dividers */
  --shadow-sm: 0 1px 2px rgba(0,0,0,0.05);
  --shadow-md: 0 4px 6px -1px rgba(0,0,0,0.1);
  --radius: 12px;            /* Border radius */
  --radius-sm: 8px;
}

/* Dark theme */
[data-theme="dark"] {
  --bg-primary: #09090b;
  --bg-secondary: #18181b;
  --bg-tertiary: #27272a;
  --text-primary: #fafafa;
  --text-secondary: #a1a1aa;
  --text-muted: #71717a;
  --border-color: #3f3f46;
  --shadow-sm: 0 1px 2px rgba(0,0,0,0.3);
  --shadow-md: 0 4px 6px -1px rgba(0,0,0,0.4);
}
```

Add your custom styles by including them after the dashboard's default styles:
```html
<link rel="stylesheet" href="path/to/dashboard.css">
<link rel="stylesheet" href="path/to/your-custom-theme.css">
```

For organization-wide dashboards, consider:
- Company color scheme alignment
- Logo integration in headers
- Custom metric thresholds and alerts
- Branded report templates
Problem: The tool reports no AI attribution data in your repository.
Solutions:
- Verify git-ai installation: `git-ai --version`
- Check for AI-attributed commits: `git notes --ref=git-ai list | head -10`
- Ensure you're in a git repository: `git status`
- Check recent commits for AI notes: `git log --oneline --grep="git-ai" -10`
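The checks above can be rolled into one small helper (the function name is our own; the `git notes` ref comes from the checklist above):

```bash
# Succeeds when the current repository has any git-ai attribution notes
has_git_ai_notes() {
  [ -n "$(git notes --ref=git-ai list 2>/dev/null)" ]
}

if has_git_ai_notes; then
  echo "git-ai attribution notes found"
else
  echo "no git-ai notes here -- work through the checklist above"
fi
```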
Problem: Analysis takes too long on big codebases.
Solutions:
- Use incremental mode (the default): `git-ai-metrics repo` only processes new commits
- Force full recalculation only when needed: `git-ai-metrics repo --no-incremental` (slower, but comprehensive)
- Enable verbose output to monitor progress: `git-ai-metrics repo --verbose`
Problem: Half-life metrics are unavailable or show zero values.
Requirements for half-life calculation:
- ✅ AI-attributed lines in the codebase
- ✅ Multiple days of commit history
- ✅ `git blame` data available for tracked files
- ✅ Lines that have survived long enough to measure
Common causes:
- New repository: No historical data yet
- Recent AI adoption: All AI code committed today
- Shallow clones: Missing full git history
Solutions:
- Wait for more commits over several days
- Ensure full git history is available
- Run analysis on repositories with established AI usage
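Shallow clones are the most common culprit in CI. A small guard (a helper of our own, not part of the tool) that restores full history only when needed:

```bash
# Unshallow the current repository so half-life analysis has full history
ensure_full_history() {
  if [ "$(git rev-parse --is-shallow-repository)" = "true" ]; then
    git fetch --unshallow
  fi
}
```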
Workflow fails with permission errors:
- Ensure repository has GitHub Pages enabled
- Check that required permissions are granted
- Verify branch protection rules allow workflow execution
Metrics not updating:
- Check if incremental mode is preserving old data
- Verify git-ai is properly installed in CI environment
- Ensure workflow has access to full git history
JSON files are empty or malformed:
- Check console output for collection errors
- Verify git-ai data integrity
- Run with the `--verbose` flag for detailed logging
HTML dashboards not rendering:
- Ensure modern browser (Chrome 90+, Firefox 88+, Safari 14+)
- Check browser console for JavaScript errors
- Verify all asset files are accessible
We welcome contributions from the community! Here's how you can help:
- **Bug Reports**: Found an issue? Open a GitHub issue
- **Feature Requests**: Have an idea? Start a discussion
- **Code Contributions**: See our contributing guidelines
- **Documentation**: Help improve docs, examples, or tutorials
- **Testing**: Report bugs or help test new features
```bash
# Clone the repository
git clone https://github.com/kibibit/git-ai-metrics.git
cd git-ai-metrics

# Install dependencies
npm install

# Build the project
npm run build

# Run tests
npm test

# Development mode with watching
npm run dev
```

- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes and add tests
- Ensure all tests pass: `npm run test`
- Update documentation as needed
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to your branch: `git push origin feature/amazing-feature`
- Open a Pull Request
- Follow TypeScript best practices
- Add unit tests for new features
- Ensure ESLint passes: `npm run lint`
- Update README for any new features or CLI options
- Discussions: Join GitHub Discussions for questions and ideas
- Issues: Use GitHub Issues for bugs and feature requests
- Discord/Slack: Check our community channels for real-time help
MIT License - see the LICENSE file for details.
Copyright (c) 2025 git-ai-metrics contributors.
Made with ❤️ for the AI-powered development community
Transform your AI collaboration data into actionable insights