Egocentric Prediction of Hand-Object Interaction

This project uses first-person (egocentric) video to predict hand-object contact: a Mask R-CNN detects hands and objects in each frame, and an LSTM predicts their trajectories. Also see the project report and project video.
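At a high level, the pipeline runs a detector on each frame, collects the detected box centers over time, and feeds that sequence to a recurrent model. The snippet below is a minimal illustration only, assuming torchvision's off-the-shelf Mask R-CNN and a plain PyTorch LSTM; the class and function names (`TrajectoryLSTM`, `predict_next_center`) are made up for this sketch and are not the project's actual API.

```python
# Minimal sketch of the detection -> trajectory-prediction idea described above.
# This is NOT the project's code: model choices, the single-detection tracking,
# and all names below are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models.detection import maskrcnn_resnet50_fpn

# COCO-pretrained Mask R-CNN used to localize hands/objects in each frame.
detector = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()


class TrajectoryLSTM(nn.Module):
    """Predicts the next 2D box center from a sequence of past centers."""

    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, centers):           # centers: (batch, time, 2)
        out, _ = self.lstm(centers)
        return self.head(out[:, -1])      # predicted next center: (batch, 2)


def box_center(box):
    """Center (x, y) of an [x1, y1, x2, y2] detection box."""
    x1, y1, x2, y2 = box
    return torch.stack([(x1 + x2) / 2, (y1 + y2) / 2])


@torch.no_grad()
def predict_next_center(frames, lstm):
    """frames: list of (3, H, W) float tensors in [0, 1] taken from the video."""
    centers = []
    for frame in frames:
        det = detector([frame])[0]
        if len(det["boxes"]) == 0:
            continue
        # Track only the highest-scoring detection, for simplicity.
        best = det["scores"].argmax()
        centers.append(box_center(det["boxes"][best]))
    seq = torch.stack(centers).unsqueeze(0)   # (1, time, 2)
    return lstm(seq).squeeze(0)               # predicted (x, y)
```

In this sketch the LSTM only extrapolates the next box center; the project itself predicts hand-object contact, which could, for example, be derived by thresholding the predicted distance between hand and object detections.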

The main functions can be run with `python run_on_video/run_on_video.py`.
