Calibration-less mode or calibration with custom look

  • #582
    JulioO
    Participant

    Hi!

    We just ordered our first Rex tracker but need to know a couple things that are critical for our application. The application is a simple interactive game where we just need to track the X-coordinate of the gaze (on screen), as an animated character will walk to always follow your gaze.

    We don’t need much accuracy (3cm range is fine), but we will have a lot of different people playing the game, so 2 questions:

    • Can the tracker work with decent accuracy without calibrating for each user?
    • If not, does the EyeX SDK already support custom calibration? I mean, a calibration with our own look and feel, inside our app, so that it doesn’t disturb the user experience?

    Thanks so much in advance!
    Julio

    #583
    Anders
    Participant

    Hi Julio,
    I think that you can get an accuracy in the 3 cm range without individual calibration for most users. But to be honest: I don’t have any data to support that claim, only my own experience with the product. I’d recommend that you start out without calibration and see if the accuracy is good enough that way. If not, you could consider a simplified two-point or five-point calibration instead of a full-blown nine-point.

    The EyeX SDK doesn’t support custom calibration, but the Gaze SDK does, and it is possible to use both in the same application.

    #584
    JulioO
    Participant

    Hi Anders,

    Thanks for answering so quickly! Great news that we can do custom calibration with the Gaze SDK. In that case we’ll do a two-point calibration, which I assume will be more than enough. From what I understand, we’d do the custom calibration with the Gaze SDK and then pass the calibration to the EyeX SDK in order to get the eye tracking data through it afterwards (we want to use the built-in filtering).

    I have another question, but please let me know if I should move it to a new post:

    We need to use the tracking on a TV monitor larger than 24″, either 32″ or 40″. We don’t need much accuracy, but we do need to go this large (and we only care about the X-axis position). I assume we could just tell EyeX the screen size, or do some kind of interpolation to make it work (even if it’s not super accurate)?

    I guess moving the tracker forward would allow us to work with a larger display, but we need to keep the tracker in the same “plane” as the monitor to avoid having the tracker too visible in the space.

    Any advice would be amazing. Thanks again!!

    Julio

    #587
    JulioO
    Participant

    Hi Anders,

    Already got the unit and starting development 🙂

    Sorry for re-posting, but it’s a question related to the initial post (the question on how to handle larger displays is still open and would be amazing to have a recommendation).

    Regarding calibration with custom look using Gaze SDK in combination with EyeX SDK for everything else, I’m trying to see how to connect the two. I see there is a sample in the Gaze SDK called MinimalCalibration, which demonstrates clearly how to do a custom look calibration.

    In the callback void stop_calibration_handler(tobiigaze_error_code error_code, void *user_data), user_data holds the just-finished calibration, from what I understand.

    The problem I have is how to pass this calibration to the EyeX SDK, as it by default uses the calibration stored in the system. Looking at the documentation it always says that the EyeX system takes care of the calibration, so I can’t see any function to pass EyeX SDK the calibration done with the Gaze SDK in the same application.

    Any help would be amazing as we’re very stuck with this.

    Thanks in advance!!

    Julio

    #588
    Anders
    Participant

    Hi Julio,
    the larger screen sizes shouldn’t be a problem as such, since you have limited needs for accuracy. The viewing distance might be more of an issue: the user will still have to be within the track box and people tend to stand further away from a larger screen. But it’s worth a try!

    The EyeX Engine can be configured for larger screens and non-standard mountings. It requires some configuration hacks, though, and that workflow hasn’t been documented yet. I’ll post an update as soon as I have something I can share.

    Regarding the sharing of calibrations: the eye tracker will always have a “current calibration” which it keeps until it’s disconnected or power cycled. The EyeX Engine downloads the calibration for the current user profile when it connects to the tracker, as well as when you switch profiles. If you connect to the tracker using the Gaze SDK and run a calibration, it will override the calibration that the EyeX Engine has set. So you don’t need to do anything else in addition to just running the calibration: it will take effect immediately and will be used until another calibration is set.
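
    For anyone finding this thread later, here is a minimal C sketch of the hand-off Anders describes. The SDK calls are stubbed out with made-up helper names (the real tobiigaze_* and EyeX calls are omitted), so this only illustrates the logic: the tracker holds one “current calibration”, and setting a calibration via the Gaze SDK is all that’s needed for the EyeX streams to pick it up.

    ```c
    #include <stdio.h>

    /* 0 = profile calibration set by the EyeX Engine,
     * 1 = our custom calibration. These helpers are hypothetical
     * stand-ins for illustration only. */
    static int current_calibration = 0;

    /* Stand-in for the EyeX Engine connecting to the tracker and
     * downloading the current user profile's calibration. */
    static void eyex_engine_connects(void) { current_calibration = 0; }

    /* Stand-in for running a Gaze SDK calibration; it overrides
     * whatever calibration the EyeX Engine had set. */
    static void gaze_sdk_calibration(void) { current_calibration = 1; }

    /* Stand-in for reading which calibration the tracker uses. */
    static int tracker_calibration(void) { return current_calibration; }

    int main(void)
    {
        eyex_engine_connects();   /* EyeX sets the profile calibration  */
        gaze_sdk_calibration();   /* our calibration overrides it       */

        /* No explicit hand-off to the EyeX SDK is needed: gaze data
         * now flows through the custom calibration until another one
         * is set or the tracker is power-cycled. */
        printf("custom calibration active: %d\n", tracker_calibration());
        return 0;
    }
    ```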

    #591
    JulioO
    Participant

    Hi Anders,

    Thanks again for your answers!! Some comments/questions/findings:

    – Larger screen size: we have tested with a 37″ display, and the EyeX itself tracks very well with no issues. The problem is exactly what you suggested: people need to stand farther away from such a large screen, and from our tests the tracker starts losing the eyes from about 90cm on (eye-to-screen distance). With 90cm as the maximum eye distance, it seems a 32″ screen is the largest size people would accept without wanting to step farther back.

    I guess the tracker would need a telephoto lens on the camera to be able to track the eyes from farther away.

    My question is about your comment that “The EyeX Engine can be configured for larger screens and non-standard mountings. It requires some configuration hacks”… what do you mean? From our tests it worked OK, so I’m wondering what those hacks would be for.

    – Regarding the calibrations mixing the Gaze and EyeX SDKs: you’re totally right, it works like a charm, and we can mix them without any hand-off code between them 🙂

    Thanks again!
    Julio

    #593
    Robert [Tobii]
    Participant

    Hi Julio,

    Anders is not awake yet (he is in San Francisco at the Game Developers Conference), but I think I can answer your new question.

    Great that it works for you even on larger screens. What Anders meant by “non-standard mounting” is that it is possible to detach the eye tracker from the screen and mount it in front of it, closer to the user, to allow for longer distances between the user and the screen. But that requires some advanced screen mappings that are not officially documented or supported at the moment.

    Looking forward to seeing your game in action!

    #594
    JulioO
    Participant

    Hi Robert,

    Thanks for your answer! I understand now. Initially we will try to hide the eye tracker as much as possible, so we’ll keep it in the same “plane” as the screen (below it). But it would be great to know which “hacks” we’d need if we ended up having to put the tracker closer to the users, in case they tend to move too far back.

    We’re closing development next week so the timeline is tight, but would it be possible to get those hacks, or at least to know what we’d need to do to mount the tracker closer to the user? That would be amazing.

    Thanks again!
    Julio

    #611
    JulioO
    Participant

    Hi again 🙂

    Another question here about the custom look calibration:

    We have implemented the custom look calibration in our app, mixing the EyeX and Gaze SDKs as suggested, and it’s working well. To make it as robust as possible, we were wondering one thing: when we call tobiigaze_calibration_add_point_async in the Gaze SDK to collect eye data for a given normalized screen coordinate, does the system take a single measurement of the eyes, or does it take several and average them? We’re worried about the person blinking at that moment, so we were thinking of calling tobiigaze_calibration_add_point_async several times per calibration coordinate and averaging the measurements. But we don’t know if that would work well, or whether the Gaze SDK already averages several measurements (we do see the IR lights blink several times).

    Any advice would be amazing to make this robust.

    Thanks!

    Julio

    #612
    Anders
    Participant

    Hi Julio,
    the add_point_async operation will collect data until it either has enough data points, or it times out. You should only have to call the function once per calibration point.

    The MinimalCalibration sample isn’t very clear on that point, but the idea is that you start displaying the calibration point in the GUI as soon as you have called add_point_async, and keep it there until you get the callback from that function.
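
    The flow Anders describes can be sketched in C roughly like this. It is modeled loosely on the MinimalCalibration sample, but show_point, hide_point and add_point are made-up stand-ins (the real call is tobiigaze_calibration_add_point_async, which collects samples until it has enough data or times out, then fires a completion callback):

    ```c
    #include <stdio.h>

    static int points_collected = 0;

    /* Hypothetical GUI helpers -- not part of any SDK. */
    static void show_point(double x, double y)
    {
        printf("show calibration point (%.2f, %.2f)\n", x, y);
    }

    static void hide_point(void) { puts("hide calibration point"); }

    /* Stand-in for the async add-point call: keep the point on screen
     * from this call until the completion callback fires. One call per
     * point is enough -- the tracker gathers the samples itself. */
    static void add_point(double x, double y, void (*on_done)(void))
    {
        (void)x; (void)y;    /* the real call sends these to the tracker */
        points_collected++;  /* ...sample collection happens here...     */
        on_done();           /* completion: now safe to hide the point   */
    }

    int main(void)
    {
        const double xs[] = { 0.1, 0.9 };  /* simplified two-point layout */
        for (int i = 0; i < 2; i++) {
            show_point(xs[i], 0.5);            /* display first...         */
            add_point(xs[i], 0.5, hide_point); /* ...then start collecting */
        }
        printf("points collected: %d\n", points_collected);
        return 0;
    }
    ```

    The key design point is the ordering: the point must already be visible when collection starts, and it stays visible until the callback, so a blink simply extends the collection window rather than ruining the point.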

    #614
    JulioO
    Participant

    Hi Anders,

    Thanks for the details, it makes total sense now, and great that a single call to add_point_async is enough… easier that way :).

    Best,
    Julio

  • The topic ‘Calibration-less mode or calibration with custom look’ is closed to new replies.