Calibrating eyes individually

  • #661
    Pablo
    Participant

    Hi,

    I would like to develop a specific application aimed at users with strabismus. I’ve studied the developer documentation and I have some questions that I’d like answered before ordering an EyeX.

    For strabismus it is important to track the eyes individually, and your documentation says that the gaze data contains information for both eyes. However, I’m concerned that the calibration process assumes that both eyes share the same gaze point, which for strabismus is by definition not true.

    This problem could be solved simply by calibrating the eyes separately: calibrate one eye while the other is covered, then repeat the process for the other eye. When the system calculates gaze positions, it would use the respective calibration data for each eye.

    Would something like this be possible? Maybe it could be done with two separate processes, each configured to track one eye while ignoring the other, one for the left eye and one for the right?

    cheers,
    pablo

    #683
    Jenny [Tobii]
    Participant

    Hi Pablo,

    Our general approach for users with strabismus is to track only one eye with the eye tracker. In EyeX this is done when setting up a user profile, where it is possible to select the option to track only the right or the left eye, whichever is the dominant one. The calibration and eye tracking are then done using only information from the selected eye.

    Having two simultaneous calibrations as you describe is not possible with EyeX.

    /Jenny

    #697
    Pablo
    Participant

    Hi Jenny,

    thanks for your answer. I’ll try to explain in more detail what I want, hoping you can tell me whether it can be done.

    My goal is simple: I want to calculate the strabismus angle in real time while the subject is using the computer. Here the strabismus angle is defined as the angle between the real axis of the deviating eye (where that eye is actually pointed) and the axis it would have if it were directed at the same point as the dominant eye (as if the subject didn’t have strabismus and the eye were not deviating).

    This could easily be done if I could calculate gaze vectors for both eyes. The problem is that I would need to calibrate the eyes separately, as I explained in my previous post.
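
    Just to make this concrete, here is roughly the computation I have in mind, as a small Python sketch (all the names and numbers are made up; it assumes I already have the 3D eye positions and the per-eye gaze points in one common coordinate system):

        import numpy as np

        def strabismus_angle_deg(deviating_eye_pos, deviating_gaze_point, dominant_gaze_point):
            """Angle (degrees) between where the deviating eye actually points and
            where it would point if it looked at the dominant eye's gaze point.
            All arguments are 3D points in the same coordinate system."""
            eye = np.asarray(deviating_eye_pos, dtype=float)
            actual = np.asarray(deviating_gaze_point, dtype=float) - eye  # real axis of the deviating eye
            ideal = np.asarray(dominant_gaze_point, dtype=float) - eye    # axis if both eyes converged
            cos_a = np.dot(actual, ideal) / (np.linalg.norm(actual) * np.linalg.norm(ideal))
            return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

        # Made-up numbers just to show the call (millimetres):
        angle = strabismus_angle_deg([30.0, 0.0, 600.0],   # deviating eye position
                                     [150.0, 40.0, 0.0],   # point it actually looks at
                                     [100.0, 40.0, 0.0])   # point the dominant eye looks at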

    You said that it is possible to track only one eye, and I had the idea of running two different clients, each configured to track a different eye. However, the Gaze SDK documentation says that calibration data is stored in the tracker’s firmware, so I guess this idea is of no use… both clients would still use the same calibration data.

    Another idea I had is to develop the whole calibration routine on my own (using the Gaze SDK), completely in software. This routine would use the raw data accessible through the API to calculate gaze positions. But I’m concerned whether enough data for this is provided through the API. The documentation says that the eye positions are the locations of the eyeballs, not the pupils, and I cannot calculate gaze directions from eyeball positions alone; I also need information about the pupil locations.

    Any thoughts on this? Like I said, my goal is calculating the strabismus angle, and any suggestions on how to do this would be highly appreciated!

    #704
    Robert [Tobii]
    Participant

    Hi Pablo,

    I would like to help you, but I am not sure that the EyeX eye tracker is the right choice for you, since it is more of a low-cost, consumer-grade eye tracker than a research tool. On the Tobii Developer Zone we are focusing (!) on using eye tracking to create new user experiences in consumer applications and games. The Research & Analysis branch of Tobii has better offerings for researchers. If your goal is to do ophthalmology research, you can start by taking a look at this page:
    http://www.tobii.com/en/eye-tracking-research/global/research/ophthalmology/

    #707
    Pablo
    Participant

    Hi Robert,
    thanks for your answer! The thing is, I’m interested in consumer-grade devices primarily because my budget is very limited. Also, their technical characteristics, like accuracy and sample rate, are sufficient for what I need. I’m willing to do some development on my side, and I’m also experimenting with some DIY and open-source solutions.

    My current idea is to use two sensors, one tracking the right eye and the other tracking the left. From the information you provided, I’m guessing this is possible? It would be more elegant and cost-efficient to use one sensor, but I’m guessing two sensors will still be cheaper than purchasing a professional-grade device…

    Anyway, thanks for your help. If by any chance you plan to update your API to export more raw data, please let me know.

    Regards,
    pablo

    #716
    Robert [Tobii]
    Participant

    Hi Pablo,

    I would not recommend using two eye trackers simultaneously, since their NIR illumination can interfere. You will probably get OK results with only one eye tracker and a one-eye calibration, and can then use both eyes in the actual measurements.

    To calculate gaze vectors with the Gaze SDK, use the Eye Position and Gaze Position points for each eye. They are both in the same coordinate system (UCS). See page 6 in “Tobii Gaze SDK Developers Guide – General Concepts”.
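
    In pseudo-code the idea is simply this (a rough sketch, not actual SDK code; it assumes you already receive the per-eye Eye Position and Gaze Position as 3D points in UCS millimetres):

        import numpy as np

        def gaze_vector(eye_position_mm, gaze_position_mm):
            """Unit gaze direction for one eye: from the Eye Position towards the
            Gaze Position, both expressed in the User Coordinate System (mm)."""
            v = np.asarray(gaze_position_mm, dtype=float) - np.asarray(eye_position_mm, dtype=float)
            return v / np.linalg.norm(v)

        # One vector per eye; comparing the deviating eye's vector with the direction
        # from that eye towards the other eye's gaze point gives the deviation angle
        # discussed earlier in this thread.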

    #718
    Pablo
    Participant

    Hi Robert,

    thanks for the info! I didn’t know I could get a gaze position when the eye is not calibrated, but if that’s possible and the error is not significant, this will be a viable solution.

    Cheers,
    pablo

    #2165
    tom
    Participant

    Hi Pablo!

    I was also looking into this application for eye trackers. I’ve also asked on another eye tracker site, but it seems both of these trackers handle both eyes together and can’t track them independently.

    Did you find any way to do this? Or another eye tracker which can do it?

    I want to detect when the bad eye turns out or in and, when that happens, pop up a message on the screen or some sort of notification, so the user will know their eye is looking in a different direction and can correct it. Sort of a biofeedback thing which will hopefully allow the user to fix their strabismus, or at least lessen it.
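
    In my head the logic is something like this (a very rough Python sketch; it assumes I can already get a deviation angle for every gaze sample from somewhere, and the numbers are made up):

        THRESHOLD_DEG = 5.0   # deviation treated as "the eye has turned" (made-up value)
        MIN_SAMPLES = 30      # must persist for ~0.5 s at 60 Hz before notifying

        def watch_deviation(angles_deg, notify):
            """Call notify() once the deviation stays above the threshold long enough."""
            consecutive = 0
            for angle in angles_deg:
                consecutive = consecutive + 1 if angle > THRESHOLD_DEG else 0
                if consecutive == MIN_SAMPLES:
                    notify()  # e.g. pop up a message or play a sound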

    I’d like to discuss this with you if you’re interested.

    Cheers.

    #2206
    Pablo
    Participant

    Hi tom,

    my first advice would be to skip calibration entirely and see how much error you get. If the error is not significant, you might not need to calibrate at all, or you can compensate for the error in software (if it is consistent enough). In my experience, this usually works well if the subject doesn’t wear any visual aid.
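
    By compensating in software I mean something as simple as this (just a sketch; it assumes you can show a few known on-screen targets and record the uncalibrated gaze points the tracker reports for the eye you care about):

        import numpy as np

        def constant_offset(target_points, measured_gaze_points):
            """Average error between known targets and the uncalibrated gaze recorded
            while looking at them. If the error is consistent enough, subtracting this
            offset from later samples can stand in for a per-eye calibration."""
            return np.mean(np.asarray(measured_gaze_points, dtype=float)
                           - np.asarray(target_points, dtype=float), axis=0)

        def corrected(gaze_point, offset):
            """Apply the offset to a new gaze sample."""
            return np.asarray(gaze_point, dtype=float) - offset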

    I’ve played with different eye trackers for this purpose and we can discuss the topic further, but let’s move it somewhere else since it’s outside the scope of this forum. You can leave me your contact details, or propose some other channel for communication.

    cheers,
    pablo

    #2258
    Anders
    Participant

    Hi pablo, tom,
    you’re most welcome to use this forum for discussions about eye tracking software if you wish! As long as it’s about software development and eye tracking, it’s perfectly within scope.

    #3199
    Pablo
    Participant

    Here’s my solution for anyone interested:
    https://github.com/balancana/GazeMonitor
