EyeX vs. ITU Gaze Tracker

Viewing 9 posts - 1 through 9 (of 9 total)
    Matt Chambers

    I’m looking for a device that’s accurate enough to support aiming in games like Star Citizen, where a mouse cursor controls aim while a joystick/gamepad/keyboard controls movement. It doesn’t have to be pinpoint accurate; something within a 20-pixel-square window would do. I’ve been looking at the ITU Gaze Tracker (http://www.gazegroup.org/downloads/23-gazetracker), and it seems to offer very good accuracy and precision (nearly as good as the commercial analytics devices) with relatively low-cost hardware. But all they make is the software: you have to build your own rig, which includes IR lights to create glints on the eyeballs and an IR-filtered camera, with a zoom lens if you want to use it remotely instead of head-mounted. A good IR-filtered camera is $300; there are cheap $40 alternatives, but it’s hard to find a zoom lens for them. Anyway, it seems like a company could commercialize this method.

    So naturally I’m curious whether the EyeX uses the same IR-based approach or visible light, and if it’s not IR, why not. I’m also curious how the accuracy/precision of the EyeX compares to the GazeTracker system.

    I read that the REX system at least is capable of compensating for head movement toward and away from the camera, something the GazeTracker is poor at (but it has no problem compensating for “in-plane” movement). Is the EyeX also capable of compensating for toward/away motion?

    Also, why not combine eye tracking with head tracking? Since you have to track the head anyway to compensate for its movement, you could make that a control axis, as TrackIR/FreeTrack do. You might not even need the reflective IR spots that TrackIR/FreeTrack require; ideally you could calculate the distance and angle between the inner/outer corners of the eyes and some other point.

    Robert [Tobii]

    Hi Matt,

    Thank you for your questions, I’ll try to address them one by one.

    All Tobii eye trackers use near-infrared light in illuminators and sensors, allowing “accurate, continuous tracking regardless of surrounding light conditions” (copied from http://www.tobii.com/en/about/what-is-eye-tracking/).

    As I said in another thread, the accuracy/precision/robustness of the EyeX Controller is being continuously improved, so there is no point in comparing it with other eye trackers right now.

    Regarding head tracking: the EyeX Controller compensates for movements in all axes, but with some delay in the z-axis (since it is a mono system with a single sensor, it requires the head to stay at the same distance for a number of frames before it can compensate).

    So, using the eye tracker as a head tracker is probably not as accurate as a dedicated head-tracking system at the moment, but probably good enough for most applications. In the next version of the EyeX SDK for Unity we will include a code sample demonstrating how to use eye position data to implement head-tracking-ish functionality in a game. We hope it will be useful for developers who want to experiment with this data stream and create innovative games based on the eye-gaze point on screen as well as the eye/head position.

    Matt Chambers

    Great. Thanks for the answers. That is a great and very informative link. Please add it to the main EyeX website (i.e. “How It Works”). Also, it’s apparently impossible to get from the main Tobii website to the EyeX website. Perhaps that is intentional (you don’t want researchers opting for the cheaper EyeX) :).

    Do you think the hardware is already sufficient to allow software improvements to achieve sub-degree precision, or will further hardware iterations be needed as well (similar to the Oculus Rift)? Are the “microprojectors” (I assume these are IR LEDs?) far enough away from the camera? I figured that the farther away they are from the camera, the easier it is to get information from all 3 points (the pupil and both reflections).

    AFAIK, you can’t do good 6DoF head tracking with only two points (the eyes). You have to get at least another point. How about an IR-reflector that sticks to the forehead or on a hat? I’d love for Tobii to knock Natural Point off its de facto monopoly on head tracking. The EyeX dev kit is already cheaper. The market needs some competition from a company that isn’t scared of their patent trolling FUD tactics. In the short term, since you are using IR, all you need to do is support integrating FreeTrack by letting it use your hardware. Then you’re REALLY close to a product that will have twice the functionality of TrackIR. Later you can integrate the algorithm they use into your own software and support both functions seamlessly.

    Jenny [Tobii]

    Hi Matt,

    Regarding getting from the Tobii website to the EyeX website: there is a direct link to the Developer Zone at the top of the page, but I don’t know how many people will actually find it. There are also a couple of links to the main Eye Experience pages.

    Regarding improvements of the EyeX Controller: I believe the hardware will not be changed, but improvements can and will be made in the software. The distances between the illuminators and the camera are carefully selected to get the best possible light reflections on the eyes for our eye tracking algorithms. There is no simple rule that tracking gets better the farther apart they are placed; it is a matter of advanced optronics involving lenses, sensors, angles and more.

    Regarding head tracking: Yes, as you point out, there are limitations in using only two points for head tracking. I think we will start by evaluating what kind of interactions we can achieve using only the two eye positions and take it from there. But it is of course worth considering alternatives.


    Hi Matt

    I totally agree that you don’t get 6 DoF with only two points. You do get 5 DoF, though, and with the positioning of the eyes the only one you miss out on is pitch (nodding up and down). It doesn’t take you all the way, but for many applications it gets you very far.
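    To make the geometry concrete, here is a rough Python sketch of how those five degrees of freedom fall out of two 3D eye positions. The coordinate conventions and function name are my own assumptions, not anything from a Tobii SDK.

```python
import math

def head_pose_from_eyes(left, right):
    """Estimate 5 of the 6 head DoF from two 3D eye positions.

    left/right: (x, y, z) in millimeters in a screen-centered frame
    (x right, y up, z toward the user). Pitch (nodding up/down)
    cannot be recovered from two points and is omitted.
    """
    # Position: midpoint between the eyes (the 3 translational DoF).
    pos = tuple((l + r) / 2 for l, r in zip(left, right))
    dx = right[0] - left[0]
    dy = right[1] - left[1]
    dz = right[2] - left[2]
    # Roll: rotation of the inter-eye axis within the screen plane.
    roll = math.atan2(dy, dx)
    # Yaw: one eye ends up closer to the screen than the other.
    yaw = math.atan2(dz, dx)
    return pos, yaw, roll

# Head 600 mm away, slightly turned (right eye closer to the screen):
pos, yaw, roll = head_pose_from_eyes((-32, 0, 605), (32, 0, 595))
```

    A third tracked point (a nose or forehead feature, say) would be needed to recover the missing pitch axis.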

    Matt Chambers

    The classic use for head tracking in gaming is moving the view independently of a vehicle or body: e.g. looking around the cockpit of a plane, which is particularly useful for WW2 dogfight simulation. That definitely requires pitch control. I can’t see WW2 dogfight simulation having much use for eye tracking unless it were used as an elaborate joystick. My intention has been to put it to good use in Star Citizen for aiming without moving the head. I tried plain head tracking with FaceTrackNoIR: the mouse emulation worked fine on the desktop, but it made the mouse inside SC go bonkers, constantly drifting to the side. I’m afraid it would be the same with the EyeX; it will require a different kind of emulation.

    David Tucker

    Sorry to bump an old thread, but I am very interested in this subject myself. I was just poring over the sample code trying to figure out how to use the EyeX as a head tracker. Right now I’m stuck on how to convert from UCS (user coordinates) to something that is relative to the center of the display (how do I detect and remove the pitch of the sensor?). I’m sure a dive into the calibration data will get me there in the end.
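    For anyone else stuck on the same step: the conversion is essentially a rotation by the sensor’s mounting pitch plus a translation to the display center. A minimal sketch, assuming the tilt angle and sensor offset are already known; the function and parameter names here are hypothetical, not part of the Gaze SDK:

```python
import math

def ucs_to_display(point, pitch_deg, offset_mm):
    """Map a tracker-relative (UCS) point into a display-centered frame.

    point: (x, y, z) from the eye tracker, in millimeters.
    pitch_deg: upward tilt of the sensor relative to the screen plane
        (a value you would have to measure or calibrate yourself).
    offset_mm: (x, y, z) of the sensor relative to the display center.
    """
    x, y, z = point
    a = math.radians(pitch_deg)
    # Rotation about the x-axis undoes the sensor's tilt.
    y2 = y * math.cos(a) - z * math.sin(a)
    z2 = y * math.sin(a) + z * math.cos(a)
    ox, oy, oz = offset_mm
    return (x + ox, y2 + oy, z2 + oz)
```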

    I’m really wondering if you couldn’t identify other features on the face, such as the ears, to support a broader range of tracking. Most head trackers have a frustratingly small window in which they work, and anything you can do to broaden that range would be good.

    I feel that one of the best uses for the EyeX in video games would be head tracking without needing a reflector: basically a specialized version of the Kinect that concentrates only on the head. That would allow you to use your monitor as a window into a 3D world and would really break you out of the box.

    Traditionally, head trackers have been misused to aim the camera, but really what you want is to track your head in order to create a ‘window’ into the 3D world. So all you need to track is the x, y, z position of your head relative to the center of the display (well, really you want to know the physical view frustum, which is a tiny bit different).

    Jenny [Tobii]

    Hi David,

    Since you are talking about UCS (User Coordinate System) I’m guessing you are looking at samples in the Gaze SDK. It is possible to do what you want with the Gaze SDK, but quite cumbersome.

    It would be easier to use the EyeX SDK to access these values through the EyeX Engine API:
    – The 3D coordinate system of the eye position data stream is centered on the center of the screen and aligned with the screen plane (that is, the screen the eye tracker has been set up on and calibrated for using the EyeX Engine).
    – The screen size is available in millimeters and pixels.

    You would have to average the positions of the individual eyes into a single point, for example centered between them, and take into account that some data points in the eye position stream may only have data for one of the eyes.
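    That averaging step, with the single-eye fallback described above, can be sketched in a few lines (plain Python; the function name is my own, not an SDK call):

```python
def cyclopean_point(left, right):
    """Combine per-eye 3D positions into one head point.

    Either argument may be None when the tracker has lost that eye;
    in that case fall back to the remaining eye.
    """
    if left is None and right is None:
        return None
    if left is None:
        return right
    if right is None:
        return left
    # Both eyes available: midpoint between them.
    return tuple((l + r) / 2 for l, r in zip(left, right))
```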

    The cyclopean eye position can be used in combination with the screen size to calculate an off-axis perspective projection matrix to create the ‘window’ effect you want.
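    For the projection itself, here is a sketch of the standard asymmetric-frustum construction, assuming a screen-centered coordinate frame with the eye position in millimeters. This is generic off-axis projection math, not EyeX SDK code:

```python
def off_axis_projection(eye_mm, screen_w_mm, screen_h_mm, near, far):
    """Off-axis perspective frustum for a screen centered at the origin.

    eye_mm: (ex, ey, ez) eye position with ez > 0 in front of the
    screen; the screen spans [-w/2, w/2] x [-h/2, h/2] at z = 0.
    Returns an OpenGL-style 4x4 matrix as row-major nested lists.
    """
    ex, ey, ez = eye_mm
    # Project the screen edges onto the near plane as seen from the eye.
    scale = near / ez
    l = (-screen_w_mm / 2 - ex) * scale
    r = ( screen_w_mm / 2 - ex) * scale
    b = (-screen_h_mm / 2 - ey) * scale
    t = ( screen_h_mm / 2 - ey) * scale
    # glFrustum-style matrix built from the asymmetric bounds.
    return [
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ]
```

    When the eye is centered, this reduces to an ordinary symmetric frustum; as the head moves off-center the frustum skews, which is what creates the ‘window’ effect.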


    David Tucker

    Thanks Jenny, I will look into the EyeX SDK then. I had assumed that the Gaze SDK would give me more detail about the actual eye position, but honestly I would rather go with the EyeX SDK since it ‘just works’ out of the box.

    I did find it fascinating to see that I could get a reasonably accurate calculation for my own IPD just by looking at the difference between the two eye positions. I was expecting that value to vary quite a bit with my head position due to accuracy errors, but it seems to do a great job! It gave me some confidence that this may work.
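    For reference, that IPD estimate is just the Euclidean distance between the two reported eye positions; in Python:

```python
import math

def ipd_mm(left, right):
    """Interpupillary distance: Euclidean distance between the two
    3D eye positions (same units as the input, e.g. millimeters)."""
    return math.dist(left, right)
```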

    I noticed that fast (well, not even that fast) horizontal head movement causes the eye tracker to drop an eye, and/or causes some lag in the position of only one of the eyes. It seems as if each eye’s position is read out in a separate pass instead of both at the same time. I don’t know the details, but presumably that is why the red lights are blinking all the time?

    I was wondering if you could add some prediction code that moves the eye position based on previous frames of motion. I know that head tracking is secondary to accurate eye tracking, but I would think it would help improve the eye tracking as well. And since the eyes are always a fixed distance from each other (mechanically speaking), prediction code should work well in this case. We use tricks like this in our own game to estimate where a car will be in the future: it is easy enough to look at the previous velocity and interpolate a new position. We take it a lot farther than that, but it should be enough for your needs.
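    The constant-velocity idea above is easy to sketch; this is a hypothetical example of mine, not anything from the SDK:

```python
class EyePredictor:
    """Linear extrapolation of an eye position from the two most
    recent samples, to bridge frames where the tracker drops an eye
    or lags on one side. A constant-velocity model, nothing fancier.
    """
    def __init__(self):
        self.prev = None
        self.curr = None

    def update(self, point):
        """Feed one (x, y, z) sample per tracker frame."""
        self.prev, self.curr = self.curr, point

    def predict(self):
        """Estimate the position one frame ahead; falls back to the
        last known sample when there is not enough history."""
        if self.curr is None:
            return None
        if self.prev is None:
            return self.curr
        return tuple(c + (c - p) for p, c in zip(self.prev, self.curr))
```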
