20/05/2014 at 22:21 #949 - Gazespeaker (Participant)
Is it possible to receive both left and right gaze point data in the data stream? I would like to handle the case of alternating strabismus by detecting which eye is actually looking at the screen and using the gaze data from that eye.
Pierre

22/05/2014 at 14:06 #966 - Jenny [Tobii] (Participant)
The gaze point data of the individual eyes are available through the Gaze SDK. But even with that, I'm not sure it would be possible to do what you want to do.
When a calibration is being done, the eye tracker calibrates per eye, based on the assumption that the eye-gaze of that eye is actually pointing toward a predefined point on the screen. When a person with strabismus is calibrated, only one eye can be assumed to point toward the predefined point. The other eye may be pointing somewhere else, so the calibration for that eye may be loaded with incorrect values that would lead to unpredictable offsets in the calculated gaze point for that eye. In the case of alternating strabismus, this would be even more unpredictable, since different eyes could be used for focusing different points during the calibration process, and no consistent calibration would be obtained for any single eye.
One alternative could be to use the eye tracker's default built-in calibration instead. This would not give good gaze-point accuracy, but since it applies a consistent calibration to each eye, the offsets should be fairly consistent across the screen.
So, the question is whether the gaze point data from the individual eyes, given these problems getting a good calibration, will be correct enough to decide which eye's data to use.

22/05/2014 at 20:29 #968 - Gazespeaker (Participant)
I would use the standard calibration and decide which eye to use based on which eye looks inside the screen boundaries most frequently (assuming the other eye is looking somewhere else).
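The heuristic described above can be sketched roughly as follows. This is a minimal illustration, not Tobii SDK code: the sample format (normalized (x, y) tuples, or None when an eye is not tracked) and the function names are assumptions for the sketch.

```python
# Pick the eye whose gaze samples fall inside the screen boundaries most
# often. Sample format is assumed: normalized (x, y) tuples in [0, 1],
# or None when the eye was not tracked in that frame.

def on_screen(point):
    """True if a normalized gaze point lies within the screen bounds."""
    if point is None:
        return False
    x, y = point
    return 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0

def pick_dominant_eye(samples):
    """samples: iterable of (left_point, right_point) pairs.
    Returns 'left' or 'right', whichever eye was on-screen more often."""
    left_hits = sum(on_screen(left) for left, _ in samples)
    right_hits = sum(on_screen(right) for _, right in samples)
    return 'left' if left_hits >= right_hits else 'right'

# Example: the right eye wanders off-screen or drops out in two of
# three samples, so the left eye is selected.
samples = [
    ((0.4, 0.5), (1.6, 0.5)),   # right eye far off-screen
    ((0.5, 0.5), (0.5, 0.5)),   # both on-screen
    ((0.6, 0.4), None),         # right eye not tracked
]
dominant = pick_dominant_eye(samples)  # 'left'
```

In practice you would run this over a sliding window of recent samples, since with alternating strabismus the dominant eye can switch over time.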
I will try to use the Gaze SDK directly then.
Thank you very much for your help.
Pierre

07/07/2015 at 11:46 #3200

25/09/2015 at 00:44 #3537 - Matthias M (Participant)
I just went to your GitHub page. Sorry for my ignorance, but since you are analyzing gaze deviations between the eyes, are you also able to analyze just one eye if the other eye is covered?
Best, Matthias

19/10/2016 at 15:54 #5844 - Anonymous (Inactive)
For a project at my school I need the gaze position of the right and the left eye. Is it possible to get this data?
Best wishes, Tamara from Austria

21/10/2016 at 12:36 #5852 - Grant [Tobii] (Keymaster)
Whilst individual eye position and gaze data are available in the Tobii Pro range of eye trackers, which are designed for scientific usage, this data is not exposed for the Tobii EyeX Tracker.

11/04/2017 at 15:08 #6662 - HayJay (Participant)
Hi, first of all I want to say that it is quite sad that you can't get the individual gaze position for each eye, even though it is obviously possible to track the eyes individually.
Now to my actual question. I need the angle between the gaze and the screen for my application. Would it be a good approach to use the (combined) gaze point and the mean of the two eye positions to construct a ray? In other words, is the combined gaze position derived from the average of the two individual gaze positions?

23/04/2017 at 15:48 #6696 - Grant [Tobii] (Keymaster)
Hi @janiss, I understand your frustration, but we were not satisfied with the accuracy of the individual gaze positions, so we determined that the average was the best means of ensuring a fluid and reliable user experience.
That being said, yes, you can indeed assume that the gaze position is the average of the two eyes, so your methodology should be fine for computing the gaze angle.
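The ray construction confirmed above can be sketched like this. The coordinate convention is an assumption for illustration (millimetres, screen plane at z = 0, eyes at z > 0), not a Tobii SDK convention: build the ray from the midpoint of the two 3D eye positions to the on-screen gaze point, then take the angle between that ray and the screen plane.

```python
import math

# Angle between the gaze ray and the screen plane, given the two 3D eye
# positions and the (averaged) gaze point on the screen. Assumed frame:
# millimetres, screen plane at z = 0, user sitting at positive z.

def gaze_angle_to_screen(left_eye, right_eye, gaze_point):
    """Return the angle (degrees) between the gaze ray and the screen plane."""
    # Midpoint of the two eye positions (the "cyclopean" eye).
    mid = [(l + r) / 2.0 for l, r in zip(left_eye, right_eye)]
    # Ray from the eye midpoint to the gaze point on the screen (z = 0).
    ray = [g - m for g, m in zip(gaze_point, mid)]
    length = math.sqrt(sum(c * c for c in ray))
    # The screen normal is the z axis, so the ray's elevation above the
    # screen plane is asin(|dz| / |ray|).
    return math.degrees(math.asin(abs(ray[2]) / length))

# Eyes 600 mm in front of the screen, looking at a point directly ahead:
# the ray is perpendicular to the screen.
angle = gaze_angle_to_screen((-30.0, 0.0, 600.0),
                             (30.0, 0.0, 600.0),
                             (0.0, 0.0, 0.0))  # 90.0 degrees
```

Looking 600 mm off to the side from the same position would give 45 degrees, as expected for a ray whose horizontal and perpendicular components are equal.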