# Egocentric Prediction of Hand-Object Interaction

This project predicts hand-object contact from first-person (egocentric) video, using Mask R-CNN for object detection and an LSTM for trajectory prediction. Also see the project report and project video.
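As a rough, hypothetical sketch of how such a two-stage pipeline can be wired together (not this repository's actual code), a pretrained torchvision Mask R-CNN can supply per-frame detections whose bounding-box centres feed a small LSTM that predicts the next position. All module and variable names below are illustrative.

```python
import torch
import torch.nn as nn
import torchvision

# Pretrained Mask R-CNN from torchvision, used here as a stand-in detector
# (assumes a recent torchvision that supports the `weights` argument).
detector = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()


class TrajectoryLSTM(nn.Module):
    """Predicts the next box centre from a sequence of past centres."""

    def __init__(self, input_dim=2, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, input_dim)

    def forward(self, centres):          # centres: (batch, time, 2)
        out, _ = self.lstm(centres)
        return self.head(out[:, -1])     # (batch, 2) predicted next centre


with torch.no_grad():
    frame = torch.rand(3, 480, 640)                  # one RGB frame in [0, 1]
    detections = detector([frame])[0]                # dict: boxes, labels, scores, masks
    boxes = detections["boxes"]                      # (N, 4) in xyxy format
    centres = torch.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                           (boxes[:, 1] + boxes[:, 3]) / 2], dim=1)

    if len(boxes) > 0:
        # Repeat the first detection's centre as a stand-in 8-frame history.
        history = centres[:1].repeat(8, 1).unsqueeze(0)   # (1, 8, 2)
    else:
        history = torch.rand(1, 8, 2) * 100               # dummy history if nothing detected

    next_centre = TrajectoryLSTM()(history)
    print(next_centre.shape)                              # torch.Size([1, 2])
```

In practice the LSTM would be trained on real per-frame hand and object tracks rather than the dummy history used here.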


The main functions can be run with `python run_on_video/run_on_video.py`.