Tracking objects with the Eye Gaze Engine

Viewing 4 posts - 1 through 4 (of 4 total)
  • #413
    Elias
    Participant

    Hi,

I’m new to this community and would like to say hello to everyone first. I have read all the posts on this forum and would like to discuss one of the crucial requirements for any good gaze-supported software: the tracking of objects.

    In the “To what degree is EyeX ready to go out-of-the-box?” topic you commented on the tracking system of the Eye Gaze Engine:

    Gaze clicking the way you suggest can be a good use case for eye tracking as long as the accuracy is sufficient. (The EyeX Engine improves the accuracy by letting the gaze click snap to the nearest activatable interactor within range.)
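For readers unfamiliar with the idea, the snapping behavior described in that quote can be sketched roughly like this. This is a minimal illustration only; the `Interactor` class, the rectangle-distance helper, and the snap radius are all assumptions on my part, not the actual EyeX API:

```python
from dataclasses import dataclass
import math

@dataclass
class Interactor:
    name: str
    x: float   # top-left corner of the bounding box
    y: float
    w: float
    h: float
    activatable: bool = True

def _distance_to_rect(px, py, it):
    # Distance from the gaze point to the nearest point on the interactor's box
    dx = max(it.x - px, 0.0, px - (it.x + it.w))
    dy = max(it.y - py, 0.0, py - (it.y + it.h))
    return math.hypot(dx, dy)

def snap_gaze_click(px, py, interactors, snap_radius=60.0):
    """Return the nearest activatable interactor within snap_radius, else None."""
    candidates = [it for it in interactors if it.activatable]
    if not candidates:
        return None
    best = min(candidates, key=lambda it: _distance_to_rect(px, py, it))
    return best if _distance_to_rect(px, py, best) <= snap_radius else None
```

So a gaze click that lands slightly outside a button still activates it, while a click in empty space far from any interactor does nothing.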

Can you describe your approach in more detail? While this seems to be a good tracking algorithm to start with, I am wondering whether you have considered more advanced tracking algorithms, for example ones based on hidden Markov models. This implementation seems quite promising. The research paper can be found here.

IMHO tracking is the most crucial issue in expanding the capabilities of eye-controlled systems. I am therefore wondering which features will be provided by the Eye Gaze Engine and which tasks have to be addressed in the application domain.

I am considering implementing some of these features myself, but I think it is best for all of us if you comment on the current status of your development and your further plans.

    Thanks!

    #428
    Robert [Tobii]
    Participant

    Hello Elias,

Welcome to the community! Thank you for a good question; I agree completely that object tracking/snapping is a key feature that every gaze-driven application needs.

Right now I’m afraid I cannot give you any implementation details on the EyeX Engine approach, beyond the fact that it is based on a probability model that takes into account parameters like object size, z-order, etc. I will see if we can share more information when we get closer to the 1.0 release.
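To make the idea concrete for other readers: a probability model like the one hinted at above could weight a gaze-distance likelihood by priors on object size and z-order. The following is a toy sketch under my own assumptions; the scoring terms and all constants are illustrative and say nothing about Tobii's actual model:

```python
import math

def interactor_score(gaze_dist_px, area_px2, z_order, sigma=40.0):
    """Toy probabilistic score for 'is the user looking at this interactor?'.
    All weighting choices here are illustrative assumptions."""
    # Gaussian likelihood: nearer to the gaze point is more probable
    likelihood = math.exp(-(gaze_dist_px ** 2) / (2 * sigma ** 2))
    # Prior favoring small targets, which need more snapping help
    size_prior = 1.0 / (1.0 + math.log1p(area_px2))
    # Prior penalizing interactors further down the z-order (partly occluded)
    depth_prior = 1.0 / (1.0 + z_order)
    return likelihood * size_prior * depth_prior

def pick_interactor(candidates):
    """candidates: list of (name, gaze_dist_px, area_px2, z_order) tuples."""
    return max(candidates, key=lambda c: interactor_score(*c[1:]))[0]
```

With a model like this, a small button near the gaze point can outscore a large window further away, which is the kind of behavior plain nearest-neighbor snapping cannot express.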

    We have been refining the EyeX Engine object (interactor) snapping continuously while developing our application software and running user tests. Our primary goal is to have something that works out-of-the-box and makes it easy for developers to create applications and games without having to bother about filtering raw data or creating their own object tracking/snapping model.

While we think that most developers will be happy using the EyeX interactors and the out-of-the-box snapping, there will of course always be advanced users (like you?) who have the skill, interest and time to experiment with different approaches. That is why we also give access to the raw data from the EyeX Engine, so applications can mix and match between interactors and their own raw-data-based algorithms if needed.

    We have also seen the GDOT video and paper. It looks like a promising implementation as you said. So far we haven’t had time to experiment and compare the EyeX Engine with the GDOT algorithm. It would be a good test to create an application in which you could switch between EyeX interactors and GDOT and measure the results. Right now we don’t have it in the roadmap for the coming months, but if some 3rd party developer (wink wink) or researcher (I will contact the GDOT team) can prove that the GDOT approach is far superior we will have to re-prioritize.

Does anyone accept the challenge? 🙂

    #491
    Robert [Tobii]
    Participant

    Update: I contacted the researchers behind the GDOT algorithm. They have been kind enough to publish their Matlab code on http://rmantiuk.zut.edu.pl/index.php/gdot/.

We will evaluate and compare this algorithm with ours when we have time; until then, you are welcome to do the comparison yourself if you are interested.

    #530
    Elias
    Participant

    First of all thank you for your effort.

    I will definitely take a closer look at the algorithm. Unfortunately, my order of the REX is still pending and I have to wait a few weeks before I can start testing.

Are you aware of other promising algorithms for gaze-based object tracking? As I mentioned elsewhere, I am planning to use the tracker to control robots and therefore have to implement a robust tracking algorithm for moving objects in a 3D scenario.

During this project I will definitely also investigate, as a side project, options to enhance the precision of tracking in an ordinary desktop environment. I hope we can achieve a level of accuracy where, in the vast majority of cases, it is possible to control an ordinary application written for mouse and keyboard. I know you are working on a piece of software for a new user experience and hope this will be a big improvement over existing approaches.
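For moving objects, a static snap radius is not enough: the key signal is whether the gaze velocity follows the object's velocity, which is the intuition behind smooth-pursuit trackers such as GDOT. Here is a toy sketch of that idea; the score formula and all parameters are my own illustrative assumptions, not the published algorithm:

```python
import math

def follow_score(gaze_pos, gaze_vel, obj_pos, obj_vel,
                 sigma_pos=50.0, sigma_vel=30.0):
    """Toy score for 'is the eye following this moving object?'.
    Combines positional proximity with gaze/object velocity agreement."""
    dp = math.dist(gaze_pos, obj_pos)  # positional mismatch (px)
    dv = math.dist(gaze_vel, obj_vel)  # velocity mismatch (px/s)
    return (math.exp(-dp ** 2 / (2 * sigma_pos ** 2))
            * math.exp(-dv ** 2 / (2 * sigma_vel ** 2)))

def pick_moving_target(gaze_pos, gaze_vel, objects):
    """objects: list of (name, pos, vel). Returns the best-scoring name."""
    return max(objects, key=lambda o: follow_score(gaze_pos, gaze_vel, o[1], o[2]))[0]
```

The velocity term is what disambiguates two objects that momentarily overlap on screen: the eye's pursuit direction reveals which one is actually being followed, which should matter a lot in a 3D robot-control scenario with crossing trajectories.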
