
Add tracking of screen coordinates #34

Open
wants to merge 20 commits into base: master
Conversation

@ritko ritko commented Dec 1, 2019

Added mapping of gaze ratios to screen coordinates, so that the library now tracks the eye point of gaze (EPOG) on the screen.

This is done by asking the user to fixate on calibration points with predetermined coordinates. These points, plus knowledge of the screen size, are used to map subsequent gaze ratios to screen coordinates. A short test is then run, in which the user fixates on test points and the EPOG error is calculated.

Mainly, this adds a new gaze_calibration.py file and completely reworks the previous example.py file.
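For readers who want the gist of the approach without opening gaze_calibration.py, here is a minimal sketch of the idea described above: fit a per-axis linear map from gaze ratios to pixel coordinates using calibration fixations, then use it to estimate EPOG and the per-test-point error. All names (`fit_axis`, `gaze_to_screen`), the calibration values, and the screen size are illustrative assumptions, not the PR's actual API.

```python
import math

SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen size in pixels

def fit_axis(ratios, coords):
    """Least-squares fit of coord = a * ratio + b for one screen axis."""
    n = len(ratios)
    mean_r = sum(ratios) / n
    mean_c = sum(coords) / n
    num = sum((r - mean_r) * (c - mean_c) for r, c in zip(ratios, coords))
    den = sum((r - mean_r) ** 2 for r in ratios)
    a = num / den
    b = mean_c - a * mean_r
    return a, b

def gaze_to_screen(h_ratio, v_ratio, x_params, y_params):
    """Map a (horizontal, vertical) gaze-ratio pair to pixel coordinates."""
    ax, bx = x_params
    ay, by = y_params
    return ax * h_ratio + bx, ay * v_ratio + by

# Calibration: the user fixates on points with known coordinates while
# we record the gaze ratios reported at each point (values made up here).
h_ratios = [0.2, 0.5, 0.8]
xs = [0, SCREEN_W // 2, SCREEN_W]   # known x of the calibration points
v_ratios = [0.3, 0.5, 0.7]
ys = [0, SCREEN_H // 2, SCREEN_H]   # known y of the calibration points

x_params = fit_axis(h_ratios, xs)
y_params = fit_axis(v_ratios, ys)

# Test phase: estimate EPOG for a fixation and compute the Euclidean
# error against the test point's true location.
tx, ty = SCREEN_W // 2, SCREEN_H // 2
x, y = gaze_to_screen(0.5, 0.5, x_params, y_params)
err = math.hypot(x - tx, y - ty)  # EPOG error in pixels
```

A per-axis linear fit is the simplest mapping consistent with the calibration procedure described; the actual PR may use a different interpolation.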

@okridgway

This would be super cool, except that after I installed everything and ran the example file, it exited with an "Abort trap: 6" error.

@okridgway

It would also be nice to be able to do a calibration via webcam and then do gaze detection from a different video source.


okridgway commented May 20, 2020

Or even just do screen coordinates without calibration. That would be ideal.
You might check this out: https://stackoverflow.com/a/52963879/11792607
(I'm a noob and don't know how to implement this)
