
git-ai-metrics


Turn raw git-ai attribution data into beautiful dashboards, half-life analytics, and organization-wide AI usage insights, all with a single command.

📊 Analyze • 📈 Track • 🏢 Scale - Understand how AI tools impact your development workflow and codebase evolution.

✨ Key Features

  • 📊 AI vs Human Code Analysis - Quantify AI adoption with detailed breakdowns of code authorship
  • ⏱️ AI Line Half-Life Analytics - Measure how long AI-written code survives in your codebase
  • 👥 Human-AI Collaboration Insights - See which developers use which AI tools and models
  • 📈 Historical Trends - Track AI adoption over time with automated snapshot generation
  • 🏢 Organization Dashboards - Aggregate metrics across multiple repositories
  • 🌙 Modern UI - Beautiful, responsive dashboards with light/dark themes
  • 🚀 GitHub Integration - Automated reporting with GitHub Actions workflows

🚀 Quick Start

One-Line Analysis

npx @kibibit/git-ai-metrics repo

That's it! Run this in any git repository tracked by git-ai to generate a comprehensive AI metrics dashboard.

What You'll Get

  • Interactive HTML Dashboard at ./ai-metrics/index.html
  • JSON Data Exports for further analysis
  • Historical Tracking across multiple runs
  • Beautiful Visualizations of AI adoption patterns

📦 Installation

Global Installation (Recommended)

npm install -g @kibibit/git-ai-metrics

Run Without Installing

npx @kibibit/git-ai-metrics repo

Development Setup

git clone https://github.com/kibibit/git-ai-metrics.git
cd git-ai-metrics
npm install
npm run build
npm link  # Optional: link globally for development

System Requirements

  • Node.js: ≥18.0.0
  • git-ai: Installation guide
  • Git: Any recent version (for repository analysis)

💡 Usage Examples

Single Repository Analysis

# Navigate to any git repository with git-ai tracking
cd your-project

# Generate AI metrics dashboard
git-ai-metrics repo

# View the results
open ai-metrics/index.html

Historical Trend Analysis

# After running daily/weekly, generate timeline view
git-ai-metrics repo-history --input ./ai-metrics

# Force full recalculation (slower but comprehensive)
git-ai-metrics repo --no-incremental --verbose

Organization-Wide Aggregation

# Aggregate metrics from multiple repositories
git-ai-metrics org --input ./repos --name "My Company"

# Custom output location
git-ai-metrics org --input ./team-repos --output ./company-metrics --name "Engineering"

Custom Date Reporting

# Generate report for specific date
git-ai-metrics repo --date 2025-01-15

# Custom output directory
git-ai-metrics repo --output ./reports/ai-insights

🖥️ CLI Reference

git-ai-metrics repo

Generate AI metrics for a single git repository.

git-ai-metrics repo [options]

Options

Option                 Description                      Default
-o, --output <path>    Output directory for reports     "./ai-metrics"
-d, --date <date>      Report date (YYYY-MM-DD format)  Today
--no-incremental       Force full recalculation         false
-v, --verbose          Show detailed progress           false

Examples

# Basic usage
git-ai-metrics repo

# Full recalculation with verbose output
git-ai-metrics repo --no-incremental --verbose

# Custom output directory and date
git-ai-metrics repo --output ./reports/ai-metrics --date 2025-01-15

git-ai-metrics repo-history

Generate historical trend analysis from existing metrics snapshots.

git-ai-metrics repo-history [options]

Options

Option               Description                 Default
-i, --input <path>   Input ai-metrics directory  "./ai-metrics"
-o, --output <path>  Output directory            Same as input
-v, --verbose        Show detailed progress      false

Example

git-ai-metrics repo-history --input ./ai-metrics --verbose

git-ai-metrics org

Aggregate AI metrics across multiple repositories for organization-wide insights.

git-ai-metrics org [options]

Options

Option               Description               Default
-i, --input <path>   Parent folder with repos  "."
-o, --output <path>  Output directory          "./org-metrics"
-n, --name <name>    Organization name         "Organization"
-v, --verbose        Show detailed progress    false

Expected Directory Structure

workspace/
├── repo-frontend/
│   └── ai-metrics/
│       ├── 2025-01-01/
│       │   ├── data.json
│       │   └── report.html
│       └── 2025-01-08/
│           └── ...
├── repo-backend/
│   └── ai-metrics/
│       └── ...
└── repo-mobile/
    └── ai-metrics/
        └── ...
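The layout above implies a simple discovery rule: any direct child of the input folder that contains an ai-metrics directory is treated as a repository to aggregate. A pure sketch of that rule (the function name and its map-based input are illustrative, not the tool's actual internals, which walk the filesystem):

```typescript
// Hypothetical model of repo discovery for the `org` command.
// `children` maps each directory in the workspace to its child entries;
// a directory qualifies when it contains an `ai-metrics` folder.
function findMetricsRepos(children: Record<string, string[]>): string[] {
  return Object.keys(children)
    .filter((repo) => children[repo].includes("ai-metrics"))
    .sort();
}
```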

Example

git-ai-metrics org --input ./team-repos --name "Engineering Team" --verbose

Output Structure

After running git-ai-metrics repo, you'll get:

ai-metrics/
├── 2025-05-28/
│   ├── data.json        # Full metrics data
│   ├── summary.json     # Quick stats for aggregation
│   └── report.html      # Interactive dashboard
├── 2025-06-01/
│   └── ...
├── history.html         # Timeline view across all dates
├── history.json         # Historical data
└── index.html           # Redirects to latest report
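Because snapshot directories are named YYYY-MM-DD, they sort chronologically as plain strings, which is how a "latest report" can be picked without parsing dates. A hedged sketch of that selection (the function name is illustrative; the real index.html redirect may be generated differently):

```typescript
// Given the entries of an ai-metrics directory, return the most recent
// dated snapshot. ISO date names sort correctly as strings.
function latestSnapshot(entries: string[]): string | undefined {
  const dated = entries.filter((name) => /^\d{4}-\d{2}-\d{2}$/.test(name));
  return dated.sort().pop();
}
```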

📊 Metrics Explained

🤖 AI vs Human Code Analysis

Understand the balance between AI-generated and human-written code:

  • AI Lines: Code lines directly attributed to AI tools
  • Human Lines: Code written or significantly modified by humans
  • Mixed Lines: AI-generated code that was edited by humans before commit
  • AI Acceptance Rate: Percentage of AI-generated code committed without changes

💡 Tip: High acceptance rates may indicate effective AI tool usage, while low rates suggest significant human refinement.
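As a worked example of the acceptance-rate definition above (illustrative arithmetic only, not the tool's internals): the rate is the share of AI-generated lines committed without changes.

```typescript
// AI acceptance rate: percentage of AI-generated lines committed as-is.
function acceptanceRate(unchangedAiLines: number, totalAiLines: number): number {
  if (totalAiLines === 0) return 0; // no AI lines yet, rate undefined -> 0
  return (unchangedAiLines / totalAiLines) * 100;
}
// e.g. 850 of 1000 AI-generated lines committed unchanged -> 85%
```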

⏱️ AI Code Half-Life

Half-life measures how long AI-written code survives in your codebase before being modified or deleted. This is crucial for understanding code quality and maintainability.

Metric   Meaning
Median   50% of AI lines survive at least this many days
Mean     Average survival time across all AI lines
P25/P75  25th and 75th percentile survival times

Interpretation:

  • Long half-life (60+ days): AI code is stable and requires minimal maintenance
  • Short half-life (< 30 days): AI code may need significant refactoring or have quality issues
  • "0 days" or "N/A": Usually indicates recent commits or insufficient tracking history

🎯 AI Tool Quality Scores

Quality scores (0-100) help compare different AI tools and prompting strategies:

Scoring Formula:

  • 60% - Half-life performance (normalized against best-performing tool)
  • 40% - AI acceptance rate

What makes a high score:

  • Long half-life (stable, maintainable code)
  • High acceptance rate (effective prompting)
  • Consistent performance across different code types
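A worked example of the 60/40 formula above. The exact normalization against the best-performing tool is an assumption (a linear ratio here); the real scoring may differ in detail.

```typescript
// Quality score (0-100): 60% normalized half-life + 40% acceptance rate.
// Half-life is normalized as a ratio against the best-performing tool
// (assumption: linear normalization).
function qualityScore(
  halfLifeDays: number,
  bestHalfLifeDays: number,
  acceptanceRatePct: number
): number {
  const halfLifeComponent =
    bestHalfLifeDays > 0 ? halfLifeDays / bestHalfLifeDays : 0;
  return 60 * halfLifeComponent + 40 * (acceptanceRatePct / 100);
}
// e.g. a 45-day half-life (best tool: 90 days) with 80% acceptance:
// 60 * 0.5 + 40 * 0.8 = 62
```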

📈 Human-AI Collaboration Insights

Track which developers use which AI tools:

  • Tool Adoption: Which AI assistants are most popular
  • Developer Preferences: Individual usage patterns
  • Team Distribution: How AI usage spreads across your team
  • Prompt Effectiveness: Which prompting strategies work best

🔧 GitHub Actions Integration

Automate AI metrics generation and publishing with GitHub Actions. All workflows are available in .github/workflows/.

🚀 Quick Setup for Single Repository

  1. Enable GitHub Pages in your repository settings (source: "GitHub Actions")
  2. Copy the workflow file to your repository:
    curl -fsSL https://raw.githubusercontent.com/kibibit/git-ai-metrics/main/.github/workflows/repo-metrics.yaml \
      -o .github/workflows/ai-metrics.yaml
  3. Commit and push - the workflow will run automatically

📊 Repository-Level Automation

The included workflow (repo-metrics.yaml) provides:

  • Automatic execution on pushes to main and weekly schedule
  • GitHub Pages deployment for public dashboards
  • Incremental processing to optimize performance
  • Historical tracking across multiple runs
  • Workflow dispatch for manual triggers

Key Features:

  • Runs on push to main branch
  • Weekly scheduled execution (Sundays at midnight)
  • Manual trigger via GitHub UI
  • Automatic deployment to GitHub Pages
  • Preserves existing metrics for incremental updates

🏢 Organization-Level Aggregation

For aggregating metrics across multiple repositories:

  1. Create a dedicated "metrics" repository
  2. Use the org workflow (org-metrics.yaml)
  3. Configure repository list via workflow inputs or environment variables
  4. Set up GitHub Pages for organization-wide dashboards

Workflow automatically:

  • Fetches metrics from each team's repositories
  • Aggregates data across all projects
  • Generates organization-wide insights
  • Publishes comprehensive dashboards

📋 Workflow Configuration

Environment Variables

env:
  DEFAULT_REPOS: |
    kibibit/frontend
    kibibit/backend
    kibibit/mobile

Manual Triggers

Use workflow dispatch to override repository list:

workflow_dispatch:
  inputs:
    repos:
      description: 'Comma-separated list of repos (owner/repo)'
      required: false
      default: ''
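Together, the two configuration paths above imply a precedence rule: a non-empty dispatch input overrides the DEFAULT_REPOS list. A hypothetical helper sketching that rule (the function name and signature are illustrative, not part of the shipped workflows):

```typescript
// Resolve the repo list: prefer the comma-separated dispatch input,
// otherwise fall back to the newline-separated DEFAULT_REPOS value.
function resolveRepos(dispatchInput: string, defaultRepos: string): string[] {
  const source = dispatchInput.trim()
    ? dispatchInput.split(",")
    : defaultRepos.split("\n");
  return source.map((repo) => repo.trim()).filter((repo) => repo.length > 0);
}
```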

🔒 Security & Permissions

Required permissions for automated workflows:

permissions:
  contents: write      # Commit metrics to repository
  pages: write         # Deploy to GitHub Pages
  id-token: write      # GitHub Pages deployment

📈 Monitoring & Alerts

Workflows include:

  • GitHub Actions summaries with key metrics
  • Automatic status reporting in PRs/issues
  • Error handling with detailed logs
  • Performance metrics for workflow optimization

🎨 Customization

Dashboard Theming

The HTML dashboards use CSS custom properties for easy theming. Override these variables to match your brand:

:root {
  /* Color scheme */
  --color-ai: #6366f1;         /* AI-generated code color */
  --color-human: #10b981;      /* Human-written code color */
  --color-mixed: #f59e0b;      /* Mixed authorship color */
  --color-accent: #6366f1;     /* Primary accent color */

  /* Background colors */
  --bg-primary: #fafafa;       /* Main background */
  --bg-secondary: #ffffff;     /* Card/component backgrounds */
  --bg-tertiary: #f4f4f5;      /* Subtle backgrounds */

  /* Text colors */
  --text-primary: #18181b;     /* Primary text */
  --text-secondary: #52525b;   /* Secondary text */
  --text-muted: #a1a1aa;       /* Muted text */

  /* UI elements */
  --border-color: #e4e4e7;     /* Borders and dividers */
  --shadow-sm: 0 1px 2px rgba(0,0,0,0.05);
  --shadow-md: 0 4px 6px -1px rgba(0,0,0,0.1);
  --radius: 12px;             /* Border radius */
  --radius-sm: 8px;
}

/* Dark theme */
[data-theme="dark"] {
  --bg-primary: #09090b;
  --bg-secondary: #18181b;
  --bg-tertiary: #27272a;
  --text-primary: #fafafa;
  --text-secondary: #a1a1aa;
  --text-muted: #71717a;
  --border-color: #3f3f46;
  --shadow-sm: 0 1px 2px rgba(0,0,0,0.3);
  --shadow-md: 0 4px 6px -1px rgba(0,0,0,0.4);
}

Custom CSS Integration

Add your custom styles by including them after the dashboard's default styles:

<link rel="stylesheet" href="path/to/dashboard.css">
<link rel="stylesheet" href="path/to/your-custom-theme.css">

Organization Branding

For organization-wide dashboards, consider:

  • Company color scheme alignment
  • Logo integration in headers
  • Custom metric thresholds and alerts
  • Branded report templates

❓ Troubleshooting

πŸ” "No git-ai data found"

Problem: The tool reports no AI attribution data in your repository.

Solutions:

  1. Verify git-ai installation:
    git-ai --version
  2. Check for AI-attributed commits:
    git notes --ref=git-ai list | head -10
  3. Ensure you're in a git repository:
    git status
  4. Check recent commits for AI notes:
    git log --oneline --grep="git-ai" -10

🐌 Slow Performance on Large Repositories

Problem: Analysis takes too long on big codebases.

Solutions:

  • Use incremental mode (default behavior):
    git-ai-metrics repo  # Only processes new commits
  • Force full recalculation only when needed:
    git-ai-metrics repo --no-incremental  # Slower, comprehensive
  • Enable verbose output to monitor progress:
    git-ai-metrics repo --verbose

📊 Half-Life Shows 0 or N/A

Problem: Half-life metrics are unavailable or show zero values.

Requirements for half-life calculation:

  • ✅ AI-attributed lines in the codebase
  • ✅ Multiple days of commit history
  • ✅ git blame data available for tracked files
  • ✅ Lines that have survived long enough to measure

Common causes:

  • New repository: No historical data yet
  • Recent AI adoption: All AI code committed today
  • Shallow clones: Missing full git history

Solutions:

  • Wait for more commits over several days
  • Ensure full git history is available
  • Run analysis on repositories with established AI usage

🔧 GitHub Actions Issues

Workflow fails with permission errors:

  • Ensure repository has GitHub Pages enabled
  • Check that required permissions are granted
  • Verify branch protection rules allow workflow execution

Metrics not updating:

  • Check if incremental mode is preserving old data
  • Verify git-ai is properly installed in CI environment
  • Ensure workflow has access to full git history

📈 Data Export Issues

JSON files are empty or malformed:

  • Check console output for collection errors
  • Verify git-ai data integrity
  • Run with --verbose flag for detailed logging

HTML dashboards not rendering:

  • Ensure modern browser (Chrome 90+, Firefox 88+, Safari 14+)
  • Check browser console for JavaScript errors
  • Verify all asset files are accessible

🤝 Contributing

We welcome contributions from the community! Here's how you can help:

🚀 Ways to Contribute

  • πŸ› Bug Reports: Found an issue? Open a GitHub issue
  • ✨ Feature Requests: Have an idea? Start a discussion
  • πŸ”§ Code Contributions: See our contributing guidelines
  • πŸ“– Documentation: Help improve docs, examples, or tutorials
  • πŸ§ͺ Testing: Report bugs or help test new features

πŸ› οΈ Development Setup

# Clone the repository
git clone https://github.com/kibibit/git-ai-metrics.git
cd git-ai-metrics

# Install dependencies
npm install

# Build the project
npm run build

# Run tests
npm test

# Development mode with watching
npm run dev

📋 Development Workflow

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Make your changes and add tests
  4. Ensure all tests pass: npm run test
  5. Update documentation as needed
  6. Commit your changes: git commit -m 'Add amazing feature'
  7. Push to your branch: git push origin feature/amazing-feature
  8. Open a Pull Request

📊 Code Quality

  • Follow TypeScript best practices
  • Add unit tests for new features
  • Ensure ESLint passes: npm run lint
  • Update README for any new features or CLI options

💬 Community

  • Discussions: Join GitHub Discussions for questions and ideas
  • Issues: Use GitHub Issues for bugs and feature requests
  • Discord/Slack: Check our community channels for real-time help

📄 License

MIT License - see the LICENSE file for details.

Copyright (c) 2025 git-ai-metrics contributors.


Made with ❤️ for the AI-powered development community
Transform your AI collaboration data into actionable insights
