I have an application written for the Tobii REX that tracks the left and right eyes (both gaze points, actually) independently. I don’t see how I can do that with the EyeX SDK. Is there any way to get the x,y of the right gaze point and the x,y of the left?
If not, is there a way to integrate both SDKs into one program – writing most of the program with the EyeX SDK but handling that specific part with the Gaze SDK?
The EyeX SDK doesn’t provide data streams for the gaze points of the individual eyes, even though the REX does track each eye individually. It would be possible to combine both SDKs as a workaround, yes, but it’s not something I would recommend for the long term.
Can you please describe your use case a bit? I’m going to add this as a feature request, and it will get more punch with a real-world use case backing it.
Thanks for your reply. What I was doing with the REX and the Gaze SDK was determining the eyes’ vergence from the distance between the two gaze points. That way I could tell whether the user was focusing on the screen or through the screen.
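For reference, the technique can be sketched roughly like this (a minimal illustration, not Tobii API code: the function names, gaze-point tuples, and the pixel threshold are all hypothetical and would need tuning per setup):

```python
# Hypothetical sketch of vergence detection from per-eye gaze points.
# When the eyes converge on the screen plane, the left and right gaze
# points nearly coincide; when the user focuses "through" the screen,
# the points spread apart. The threshold value is illustrative only.
import math

def gaze_distance(left, right):
    """Euclidean distance in pixels between left and right gaze points."""
    return math.hypot(right[0] - left[0], right[1] - left[1])

def is_focused_on_screen(left, right, threshold_px=60.0):
    """Crude classifier: small inter-gaze distance => focus on screen."""
    return gaze_distance(left, right) < threshold_px

# Near-coincident points => likely focusing on the screen itself
print(is_focused_on_screen((960.0, 540.0), (972.0, 541.0)))   # True
# Widely separated points => likely focusing beyond the screen
print(is_focused_on_screen((800.0, 540.0), (1100.0, 540.0)))  # False
```

This is exactly why the per-eye gaze streams matter: the combined (averaged) gaze point that the EyeX SDK exposes hides the disparity the classification depends on.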
I’d really like to find a way to do the same (vergence detection) with the EyeX and the EyeX SDK without falling back to the Gaze SDK. Do you think that would be possible?