
Deep learning pipeline for player detection and analytics using YOLO, DeepSORT, and custom metrics covers model training, tracking, feature extraction, insights, and visualization for sports data.


Atheeth24091998/Player-tracking-and-performance-analysis


Overview


This repository focuses on player tracking and insights extraction from video, expanding on a previously developed ball-tracking system. We utilize YOLOv11 (You Only Look Once, version 11) for player tracking within the Adidas Igloo environment.

Key features extracted from the player video are:

  • Player Tracking: Detect and track player movements using YOLOv11 [1].
  • Ball Analysis: Calculate maximum speed, average speed, and total distance traveled.
  • Player Insights: Measure player speed, movement patterns, and total distance covered.
  • Region-Based Analysis: Determine the percentage of time spent in different igloo zones.
  • Heatmaps: Generate 2D and 3D heatmaps to visualize movement density.

Data Collection and Annotation

  • The images are extracted from the video at a rate of 5 frames per second, maintaining a resolution of 1920x1080 for high-quality analysis.
  • The extracted frames are manually annotated using the Labelme tool, with the annotated player images stored in the Player_Annotation_Images folder.
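The downsampling step above can be sketched as follows. This is a minimal sketch assuming a 30 fps source video; in the actual pipeline, OpenCV's `cv2.VideoCapture` would read the kept frames and write them out at 1920x1080 for annotation.

```python
def sample_indices(total_frames, src_fps=30.0, target_fps=5.0):
    """Return the frame indices to keep when downsampling src_fps -> target_fps."""
    step = max(1, round(src_fps / target_fps))
    return list(range(0, total_frames, step))

# At 30 fps, extracting 5 frames per second means keeping every 6th frame:
# sample_indices(30) -> [0, 6, 12, 18, 24]
```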

Model Selection and Training

  • The annotated images are used to train the YOLO model (11m version), focusing on detecting player positions and movements.
  • The model was trained for a total of 200 epochs, with an initial learning rate of lr0 = 0.001 to optimize performance and accuracy.
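A sketch of the training call using the Ultralytics Python API; the dataset YAML path and image size are assumptions, while the epochs and initial learning rate match the values stated above.

```python
from ultralytics import YOLO

# Start from the pretrained YOLO11m checkpoint and fine-tune on the
# annotated player dataset (the dataset YAML path is a placeholder).
model = YOLO("yolo11m.pt")
model.train(
    data="player_dataset.yaml",  # hypothetical dataset config
    epochs=200,                  # as stated above
    lr0=0.001,                   # initial learning rate, as stated above
    imgsz=1920,                  # assumed, matching the 1920x1080 frames
)
```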

Conclusions from Results:

  • From the confusion matrix, we can observe that the model performs exceptionally well in classifying the "player" class, with minimal misclassification. The high values suggest that the model reliably distinguishes players from the background, demonstrating strong accuracy and robustness in detection.
  • Box Loss, Classification Loss, and Distribution Focal Loss (DFL) are all decreasing steadily, which suggests that the model is learning effectively.
  • Precision and Recall are nearly 1.0, which means the model detects almost all players while avoiding false positives.
  • mAP@50 and mAP@50-95 are increasing and stabilizing, indicating strong generalization behaviour.

Details about the key feature implementations

Player Tracking

1. YOLO (You Only Look Once) - Object Detection

  • Approach: YOLO is a deep learning-based object detection model that processes the entire image in a single pass, making it extremely fast and efficient. It detects players by analyzing the frame and classifying bounding boxes around them.
  • Implementation: In the code, YOLO is used as the primary detector, running on each video frame with a confidence threshold of 0.7 to filter out weak detections. It provides initial player positions, which are then passed to the tracking system.
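The confidence filtering described above amounts to a simple cut-off over per-frame detections. A minimal sketch, assuming each detection is a `(xyxy_box, confidence, class_id)` tuple mirroring what a YOLO model returns:

```python
def filter_detections(detections, conf_threshold=0.7):
    """Keep only detections whose confidence meets the threshold.

    Each detection is a (xyxy_box, confidence, class_id) tuple.
    """
    return [d for d in detections if d[1] >= conf_threshold]

dets = [((0, 0, 10, 10), 0.95, 0), ((5, 5, 20, 20), 0.40, 0)]
strong = filter_detections(dets)  # only the 0.95-confidence box survives
```

With the Ultralytics API, the same cut-off can also be passed directly at inference time, e.g. `model(frame, conf=0.7)` (assumed usage).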

2. DeepSORT - Object Tracking

  • Approach : DeepSORT (Simple Online and Realtime Tracking with a Deep Association Metric) [2] is a tracking algorithm that uses motion information and appearance features to track objects across frames. It ensures the same player is consistently identified even when moving.
  • Implementation : After YOLO detects a player, DeepSORT refines the tracking by associating detections with previously tracked objects using a combination of IoU (Intersection over Union) and a deep feature similarity network. It helps filter out incorrect or unreliable YOLO detections.
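Handing YOLO detections to a DeepSORT tracker typically requires a format conversion, since common DeepSORT implementations expect `([left, top, width, height], confidence, class)` tuples rather than xyxy corner boxes. A sketch of that glue step (the `deep_sort_realtime` package and its parameter names are assumptions):

```python
def to_deepsort_input(xyxy_detections):
    """Convert (x1, y1, x2, y2) corner boxes into the
    ([left, top, width, height], confidence, class) tuples
    commonly expected by DeepSORT trackers."""
    converted = []
    for (x1, y1, x2, y2), conf, cls in xyxy_detections:
        converted.append(([x1, y1, x2 - x1, y2 - y1], conf, cls))
    return converted

# Typical usage (package and argument names are assumptions):
# from deep_sort_realtime.deepsort_tracker import DeepSort
# tracker = DeepSort(max_age=30)
# tracks = tracker.update_tracks(to_deepsort_input(dets), frame=frame)
```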

3. Kalman Filter - Prediction Model

  • Approach : The Kalman Filter is a mathematical model that predicts the next position of an object based on its previous state (position, velocity). It smooths noisy detections and handles occlusion when YOLO fails to detect the player.
  • Implementation : If YOLO and DeepSORT lose track of the player, the Kalman Filter predicts their next position based on previous movement. Once a new YOLO detection is available, it updates the prediction to improve accuracy. This ensures continuous tracking even in difficult scenarios.
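The predict/update cycle described above can be sketched with a minimal constant-velocity Kalman filter applied per coordinate axis. This is a simplified stand-in for the filter DeepSORT uses internally, and the noise parameters are illustrative:

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one coordinate axis."""

    def __init__(self, x=0.0, v=0.0, p=1.0, q=0.01, r=0.1):
        self.x, self.v = x, v                      # state: position, velocity
        self.p11, self.p12, self.p22 = p, 0.0, p   # covariance entries
        self.q, self.r = q, r                      # process / measurement noise

    def predict(self, dt=1.0):
        """Propagate the state one step: used when no detection is available."""
        self.x += self.v * dt
        p11 = self.p11 + 2 * dt * self.p12 + dt * dt * self.p22 + self.q
        self.p12 += dt * self.p22
        self.p22 += self.q
        self.p11 = p11
        return self.x

    def update(self, z):
        """Correct the prediction with a new detection z."""
        s = self.p11 + self.r                      # innovation covariance
        k1, k2 = self.p11 / s, self.p12 / s        # Kalman gains
        y = z - self.x                             # measurement residual
        self.x += k1 * y
        self.v += k2 * y
        p11, p12, p22 = self.p11, self.p12, self.p22
        self.p11, self.p12 = (1 - k1) * p11, (1 - k1) * p12
        self.p22 = p22 - k2 * p12
        return self.x
```

For 2D tracking, one such filter runs per axis: `predict` carries the player through missed detections, and `update` pulls the estimate back toward the next YOLO detection.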

Ball Analysis

1. Ball Maximum Speed

  • Unit : Meters per second (m/s)
  • Calculation : The speed of the ball is measured over a 30-frame window, and the highest recorded speed is stored as the maximum speed.
  • Usage : This metric helps in understanding the peak velocity reached by the ball during gameplay.

2. Ball Average Speed

  • Unit : Meters per second (m/s)
  • Calculation : The total speed accumulated across all frames is divided by the total number of frames processed to obtain the average speed.
  • Usage : Provides insight into the general pace of the ball throughout the session.

3. Ball Total Distance Traveled

  • Unit : Meters
  • Calculation : The total distance is computed by summing the frame-to-frame Euclidean distances between ball positions across all frames.
  • Usage : This helps in analyzing the movement pattern and coverage of the ball within the playing area.
  • Testing : To test how far the ball travels, we used a part of the video where the ball moved from the center of the igloo to the goal, which is 3 meters. The algorithm's prediction was very close to this reference distance.
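The three metrics above reduce to summed Euclidean distances and a sliding 30-frame window. A sketch in plain Python; the pixel-to-metre scale of 0.01645 is taken from the player-distance section below, and the 30 fps frame rate is an assumption:

```python
import math

def total_distance(points, scale=0.01645):
    """Sum frame-to-frame Euclidean distances (pixels) and convert to metres."""
    px = sum(math.hypot(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(points, points[1:]))
    return px * scale

def max_window_speed(points, fps=30, window=30, scale=0.01645):
    """Highest speed (m/s) observed over any sliding 30-frame window."""
    best = 0.0
    for i in range(len(points) - window):
        d = total_distance(points[i:i + window + 1], scale)
        best = max(best, d / (window / fps))
    return best
```

The average speed follows the same pattern, dividing the accumulated per-window speeds by the number of windows processed.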

Player Insights

1. Maximum Player Speed

  • Units: Speed is measured in meters per second (m/s).
  • Approach: The function checks for continuous player detection over 30 frames (~1 second at 30 FPS). It calculates the total distance traveled over these frames using the Euclidean distance formula. The largest speed value recorded within any 30-frame segment is considered the maximum speed.

2. Player Average Speed

  • Units : Speed is measured in meters per second (m/s).
  • Approach : Similar to max speed, the function calculates distance over 30-frame intervals where the player is continuously detected. It sums up all valid speed values across the video and computes the average speed.

3. Total Distance Calculation

  • Units : Distance is measured in meters (m).
  • Approach : The function tracks the player's movement from frame to frame. It calculates the Euclidean distance between consecutive frames and sums up all distances. A scale factor (0.01645) converts pixel distances into real-world meters.
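The speed metrics above are only computed over windows where the player is continuously detected for 30 frames. That continuity check can be sketched as follows; representing detections as a list of per-frame boolean flags is an assumption:

```python
def continuous_windows(detected, window=30):
    """Yield start indices i such that the player is detected in every one of
    frames i .. i+window-1, i.e. the window is valid for a speed estimate."""
    run = 0
    for i, flag in enumerate(detected):
        run = run + 1 if flag else 0
        if run >= window:
            yield i - window + 1
```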

Region-Based Analysis

  • Regions are divided based on equal radius intervals of 45.5 pixels each, starting from the center and extending outward to 182 pixels. This ensures that all four regions have the same radial width.
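Mapping a position to one of the four concentric regions, and tallying time percentages, can be sketched as (the center coordinates are assumptions supplied by the caller):

```python
import math

def region_index(x, y, cx, cy, band=45.5, n_regions=4):
    """Map a position to one of n_regions concentric rings of equal
    radial width, clamping anything beyond the last ring into it."""
    r = math.hypot(x - cx, y - cy)
    return min(int(r // band), n_regions - 1)

def region_percentages(positions, cx, cy, band=45.5, n_regions=4):
    """Percentage of frames spent in each ring."""
    counts = [0] * n_regions
    for x, y in positions:
        counts[region_index(x, y, cx, cy, band)] += 1
    total = len(positions) or 1
    return [100.0 * c / total for c in counts]
```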

Heatmaps

  • The function generates a 2D heatmap of player movement within a circular area, divided into four concentric regions. It normalizes movement density, applies a custom colormap to represent heat intensity, and visualizes player distribution within the defined zones.
  • The function generates a 3D heatmap of player movement within a circular area, using a surface plot to visualize movement density. It overlays three concentric circular regions to analyze player distribution and calculates percentage-based movement zones.
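Both heatmaps are built on the same binning step: counting positions into a grid and normalising. A minimal sketch of that step (the grid resolution is illustrative); matplotlib's `imshow` and `plot_surface` would then render the 2D and 3D views respectively.

```python
def density_grid(positions, width, height, bins=50):
    """Count player positions into a bins x bins grid; normalising this grid
    gives the movement-density map that both heatmaps visualise."""
    grid = [[0] * bins for _ in range(bins)]
    for x, y in positions:
        gx = min(int(x / width * bins), bins - 1)
        gy = min(int(y / height * bins), bins - 1)
        grid[gy][gx] += 1
    return grid

# e.g. matplotlib.pyplot.imshow(grid, cmap="hot") for the 2D view,
# or Axes3D.plot_surface over the same grid for the 3D surface view.
```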

Example Outputs

  • Insights_sample.xlsx in example_outputs/ shows a sample structure for analytic output (no real player data).

  • Detection_samples/ and heatmaps/ contain anonymized images demonstrating bounding box/heatmap results.
  • ReadMe_Pictures/ provides metrics plots for model evaluation (e.g., confusion matrix, loss convergence).

All outputs are synthetic or anonymized and do not contain confidential project data.

Data Privacy Notice

This repository contains only source code written by the author as part of the Adidas-FAU research collaboration. No raw data, annotated images, or trained model weights from Adidas or FAU are included, in compliance with strict privacy and partnership agreements.

All example outputs (Excel, images, heatmaps) are synthetic, anonymized, or generated from test/public data for demonstration purposes only.

References

[1] Joseph Redmon, Santosh Divvala, Ross Girshick, Ali Farhadi. You Only Look Once: Unified, Real-Time Object Detection. arXiv preprint arXiv:1506.02640, 2016. https://arxiv.org/abs/1506.02640

GitHub repository: https://github.com/ultralytics/ultralytics

[2] Nicolai Wojke, Alex Bewley, Dietrich Paulus. Simple Online and Realtime Tracking with a Deep Association Metric. 2017 IEEE International Conference on Image Processing (ICIP). https://arxiv.org/abs/1703.07402
