This project implements a real-time Rock-Paper-Scissors game using YOLOv8 for hand gesture detection. The game runs in Google Colab and utilizes a webcam for player input.
- Both players must start by showing "Rock".
- Move your hands for 3 seconds. A player who does not move their hand within this window loses 1 point.
- Show your move (Rock, Paper, or Scissors).
- YOLOv8 detects the moves and determines the winner.
- Bounding boxes are drawn around detected moves for visualization.
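The game flow above boils down to two pieces of logic: deciding the winner from the two detected gestures, and checking whether a hand actually moved during the 3-second window. A minimal sketch of both follows; the class names (`Rock`, `Paper`, `Scissors`) and the pixel threshold are assumptions based on this README, not the exact notebook code.

```python
# Which move each gesture defeats (assumed class names from this README).
BEATS = {"Rock": "Scissors", "Paper": "Rock", "Scissors": "Paper"}

def decide_winner(move1: str, move2: str) -> int:
    """Return 0 for a draw, 1 if player 1 wins, 2 if player 2 wins."""
    if move1 == move2:
        return 0
    return 1 if BEATS[move1] == move2 else 2

def hand_moved(centers: list[tuple[float, float]], min_shift: float = 20.0) -> bool:
    """Check whether a hand moved enough during the 3-second window,
    given the bounding-box centers YOLO reported on each frame.
    The 20-pixel threshold is an illustrative default, not the notebook's value."""
    xs = [c[0] for c in centers]
    ys = [c[1] for c in centers]
    return (max(xs) - min(xs)) + (max(ys) - min(ys)) >= min_shift
```

In the game loop, `hand_moved` would be fed the centers of one player's boxes collected over the 3-second countdown, and `decide_winner` the final class labels from the last frame.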
- Open this Colab notebook.
- Replace the `API_KEY` with your own from Roboflow.
- Run all cells to start the game.
- Allow camera access in Google Colab.
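Colab cannot read the webcam through `cv2.VideoCapture` directly; the usual pattern (assumed here, not taken verbatim from the notebook) is to capture a frame in the browser with JavaScript and return it to Python via `google.colab.output.eval_js` as a base64 data URL. A minimal helper for the Python side:

```python
from base64 import b64decode

def data_url_to_jpeg(data_url: str) -> bytes:
    """Turn a 'data:image/jpeg;base64,...' string (as returned to Python by
    Colab's eval_js after a JavaScript webcam capture) into raw JPEG bytes."""
    header, encoded = data_url.split(",", 1)
    if not header.startswith("data:image"):
        raise ValueError("expected an image data URL")
    return b64decode(encoded)
```

The resulting bytes can then be decoded for the model with `cv2.imdecode(np.frombuffer(data, np.uint8), cv2.IMREAD_COLOR)`.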
- The dataset was created using Roboflow.
- It contains labeled images of hands showing Rock, Paper, and Scissors.
- The dataset was made with the help of many students from Shahid Beheshti University (I don't take that university seriously, but their computer engineering students are quite helpful!).
- To use the dataset, you must generate your API key from Roboflow and replace it in the notebook.
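Downloading the dataset typically looks like the sketch below, using the `roboflow` Python package. The workspace, project, and version names are placeholders; substitute the ones shown on your own Roboflow dataset page.

```python
def download_dataset(api_key: str):
    """Download the Rock-Paper-Scissors dataset in YOLOv8 format.
    Workspace/project/version here are placeholders -- replace them with
    the values from your Roboflow account."""
    from roboflow import Roboflow  # pip install roboflow
    rf = Roboflow(api_key=api_key)
    project = rf.workspace("your-workspace").project("rock-paper-scissors")
    return project.version(1).download("yolov8")
```

The returned dataset object exposes the local folder path, which can be passed to YOLOv8 training via its `data.yaml`.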
- For anyone interested, a notebook that detects Rock-Paper-Scissors using the MediaPipe library is also included in this repository.
- https://github.com/Gholamrezadar/yolo11-rock-paper-scissors-detection
- Great code for detecting objects using YOLO in Google Colab with a webcam
- https://youtu.be/k2EahPgl0ho?si=U-R9ZZeoq7Fx8G0b
- https://github.com/cvzone/cvzone/blob/master/cvzone/HandTrackingModule.py
- YOLOv8 (Ultralytics) for real-time hand gesture detection
- Google Colab for running the game
- OpenCV for image processing
- Roboflow for dataset management