Thanks for your answer. I'll try to explain in more detail what I want, hoping you can tell me whether it can be done.
My goal is simple: I want to calculate the strabismus angle in real time while the subject is using the computer. Here the strabismus angle is defined as the angle between the actual visual axis of the deviating eye (where the eye is really pointed) and the axis that eye would have if it were directed at the same point as the dominant eye (i.e., as if the subject didn't have strabismus and the eye weren't deviating).
This could easily be done if I could calculate gaze vectors for both eyes. The problem is that I would need to calibrate the eyes separately, as I explained in my previous post.
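Just to make the geometry concrete, here is a minimal sketch of the angle computation itself, assuming I already had a measured 3D gaze direction for the deviating eye, its eyeball position, and the fixation point of the dominant eye. All names and inputs are hypothetical; none of this comes from the SDK directly:

```python
import numpy as np

def strabismus_angle_deg(deviating_gaze, deviating_eye_pos, fixation_point):
    """Angle (degrees) between the deviating eye's measured gaze
    direction and the direction it *would* have if it fixated the
    same point as the dominant eye.

    deviating_gaze    : measured 3D gaze direction of the deviating eye
    deviating_eye_pos : 3D position of the deviating eyeball center
    fixation_point    : 3D point the dominant eye is fixating
    (hypothetical inputs; in practice they would have to come from
    per-eye calibration, which is exactly the open problem here)
    """
    # Direction the deviating eye *should* have: from its eyeball
    # center toward the dominant eye's fixation point.
    expected = np.asarray(fixation_point, float) - np.asarray(deviating_eye_pos, float)
    v1 = np.asarray(deviating_gaze, float)
    v1 = v1 / np.linalg.norm(v1)
    v2 = expected / np.linalg.norm(expected)
    # Clip guards against tiny floating-point overshoot outside [-1, 1].
    cos_a = np.clip(np.dot(v1, v2), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_a)))

# Example: eye at the origin, dominant eye fixates a point straight
# ahead, deviating eye's gaze is rotated in the horizontal plane.
angle = strabismus_angle_deg([0.1, 0.0, 1.0],
                             [0.0, 0.0, 0.0],
                             [0.0, 0.0, 1.0])
# angle == atan(0.1) in degrees, roughly 5.71
```

So the math is trivial; everything hinges on getting valid per-eye gaze vectors in the first place.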
You said it is possible to track only one eye, so I had the idea of running two different clients, each configured to track a different eye. However, the Gaze SDK documentation says that calibration data is stored in the tracker's firmware, so I guess this idea is of no use: both clients would still use the same calibration data.
Another idea I had is to develop the whole calibration routine on my own (using the Gaze SDK), completely in software. This routine would use the raw data accessible through the API to calculate gaze positions. But I'm not sure whether the API provides enough data for this. The documentation says the eye positions are the locations of the eyeballs, not the pupils, and I cannot calculate gaze directions from eyeball positions alone; I would also need the pupil locations.
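To illustrate why I think I need both points: if the API did expose an eyeball center and a pupil center in the same 3D coordinate frame, a per-eye gaze direction would just be the ray through them. This is a sketch under that assumption, not something the SDK is documented to support:

```python
import numpy as np

def gaze_direction(eyeball_center, pupil_center):
    """Per-eye gaze direction as the unit vector from the eyeball
    center through the pupil center.

    Both arguments are hypothetical 3D points in a common coordinate
    frame; the SDK would have to expose *both* per eye for this to
    work, which is exactly what I'm unsure about.
    """
    d = np.asarray(pupil_center, float) - np.asarray(eyeball_center, float)
    return d / np.linalg.norm(d)

# Example: pupil 12 mm in front of the eyeball center along z
# gives a gaze direction of [0, 0, 1].
direction = gaze_direction([0.0, 0.0, 0.0], [0.0, 0.0, 12.0])
```

With eyeball positions alone there is no second point to define this ray, which is why I'm stuck.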
Any thoughts on this? As I said, my goal is to calculate the strabismus angle, and any suggestions on how to do this would be highly appreciated!