# JawTrack

JawTrack is a real-time jaw motion analysis system that uses computer vision to track and analyze jaw movements. Built with MediaPipe and OpenCV, it provides quantitative measurements for jaw motion assessment.
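Under the hood, each frame is passed through MediaPipe's face mesh to obtain facial landmarks, which are then converted into jaw metrics. A minimal sketch of that landmark step, assuming a single pre-loaded frame (the file name and setup here are illustrative, not JawTrack's actual code):

```python
import cv2
import mediapipe as mp

# Run MediaPipe's face mesh on a video frame to obtain facial landmarks.
face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=False,  # video mode: landmarks are tracked across frames
    max_num_faces=1,
    refine_landmarks=True,
)

frame = cv2.imread("frame.png")  # stand-in for a frame read via cv2.VideoCapture
results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    # Each landmark carries normalized x, y in [0, 1] plus a relative z.
    landmarks = results.multi_face_landmarks[0].landmark
    print(f"Detected {len(landmarks)} landmarks")
```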
## Features

- Real-time jaw motion tracking
- Video-based analysis
- Quantitative measurements (see the sketch after this list):
  - Jaw opening distance
  - Lateral deviation
  - Movement patterns
- Data visualization
- Assessment reports
- CSV data export
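The headline measurements can be derived from face-mesh landmarks. One plausible approach, sketched below with commonly cited mesh indices (13 = upper inner lip, 14 = lower inner lip, 152 = chin, 1 = nose tip); JawTrack's actual landmark choices, calibration, and units may differ:

```python
import numpy as np

# Illustrative MediaPipe face-mesh indices (verify against the canonical
# mesh topology): 13 = upper inner lip, 14 = lower inner lip,
# 152 = chin, 1 = nose tip.
UPPER_LIP, LOWER_LIP, CHIN, NOSE_TIP = 13, 14, 152, 1

def to_pixels(lm, width, height):
    """Convert a normalized landmark to pixel coordinates."""
    return np.array([lm.x * width, lm.y * height])

def jaw_opening_px(landmarks, width, height):
    """Jaw opening as the pixel distance between the inner lip landmarks."""
    upper = to_pixels(landmarks[UPPER_LIP], width, height)
    lower = to_pixels(landmarks[LOWER_LIP], width, height)
    return float(np.linalg.norm(lower - upper))

def lateral_deviation_px(landmarks, width, height):
    """Signed horizontal offset of the chin from the facial midline
    (approximated here by the nose tip); positive = toward image right."""
    chin = to_pixels(landmarks[CHIN], width, height)
    nose = to_pixels(landmarks[NOSE_TIP], width, height)
    return float(chin[0] - nose[0])
```

Note that pixel values would still need a calibration step (e.g. a reference object of known length) before they can be reported in millimetres.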
## Requirements

- Python 3.10+
- OpenCV
- MediaPipe
- Gradio
- NumPy
- Pandas
- Matplotlib
## Installation

- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/jawtrack.git
  cd jawtrack
  ```

- Create a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
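As a quick sanity check (a generic import test, not a JawTrack script), the core dependencies should now import cleanly:

```python
# Confirm the core dependencies import and report their versions.
import cv2
import gradio
import mediapipe

print("OpenCV:", cv2.__version__)
print("MediaPipe:", mediapipe.__version__)
print("Gradio:", gradio.__version__)
```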
## Usage

- Start the application:

  ```bash
  python app.py
  ```

- Open your web browser and navigate to:

  ```
  http://localhost:7860
  ```

- Upload a video or use your webcam for real-time analysis (a minimal interface sketch follows)
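For orientation, a minimal Gradio app serving on port 7860 has roughly this shape; JawTrack's actual `app.py` and its analysis pipeline will differ, and the `analyze` stub below is purely illustrative:

```python
import gradio as gr

def analyze(video_path):
    """Illustrative stub: the real pipeline would run landmark tracking here."""
    return f"Received video at {video_path}"

demo = gr.Interface(
    fn=analyze,
    inputs=gr.Video(),  # accepts uploaded clips; Gradio also supports webcam capture
    outputs="text",
)
demo.launch(server_port=7860)  # matches the URL above
```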
## Development

- Install development dependencies:

  ```bash
  pip install -r requirements-dev.txt
  ```

- Run tests (an illustrative test follows):

  ```bash
  pytest tests/
  ```
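A toy example in the pytest style the suite presumably uses; the helper here is a stand-in, not a real JawTrack module:

```python
# tests/test_measurements.py (illustrative only)
import numpy as np

def jaw_opening_px(upper, lower):
    """Toy stand-in for a JawTrack measurement helper."""
    return float(np.linalg.norm(np.asarray(lower) - np.asarray(upper)))

def test_jaw_opening_is_euclidean_distance():
    assert jaw_opening_px((0, 0), (3, 4)) == 5.0
```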
## Project Structure

```
jawtrack/
├── README.md
├── requirements.txt
├── setup.py
├── jawtrack/
│   ├── core/
│   ├── analysis/
│   └── ui/
├── tests/
└── examples/
```
## Contributing

- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Authors

- Your Name - Initial work
## Acknowledgments

- MediaPipe team for the face mesh implementation
- OpenCV community
- Gradio team for the web interface framework