tobii REX accuracy

Viewing 2 posts - 1 through 2 (of 2 total)
  • #725
    adrian
    Participant

    Can anyone please assist me with a general understanding of the tobii REX? When Anders@tobii states that the eye tracker is accurate to 0.4 degrees, what exactly does that mean? If two objects are displayed on a screen about one inch apart, how can I explain to someone else that the object you **intend** to select **will** be selected? How close together would objects need to be before the device selects something other than the object you intend to look at?

    These questions may be elementary, but I am quite eager to hear your explanations and prior experience with tracking relatively close objects.

    Thanks.

    #727
    Anders
    Participant

    Hi adrian,
    I’m glad you asked. These questions are not at all elementary!

    First things first: the statement that “the eye tracker is accurate to about 0.4 degrees” means that, given that the eye tracker is set up and calibrated properly, the measurement error between the user’s actual gaze point and the point that the eye tracker reports as the user’s gaze point is on average 0.4 degrees. The average is taken over several users and over several points spread out on a typical screen.

    More info about the definitions and test methods for accuracy and precision can be found here.

    But then, how does the accuracy figure translate to selecting the right thing?

    Suppose that you have an accuracy of 0.4 degrees and absolutely no noise — that is, perfect precision. Then you would know that the reported gaze point is off by 0.4 degrees, or about 0.5 cm at 60 cm distance, on average.
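    The conversion from visual angle to on-screen distance is simple trigonometry: the offset is the eye-to-screen distance times the tangent of the angle. A minimal sketch (function name is my own, not a Tobii API):

    ```python
    import math

    def angle_to_offset_cm(angle_deg: float, distance_cm: float) -> float:
        """On-screen offset (cm) subtended by a visual angle at a given eye-to-screen distance."""
        return distance_cm * math.tan(math.radians(angle_deg))

    # 0.4 degrees of error at a typical 60 cm viewing distance:
    offset = angle_to_offset_cm(0.4, 60.0)  # about 0.42 cm, i.e. roughly 0.5 cm
    ```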

    Now, you want to make sure that the user can select an item, and that the selected item is the right one. To ensure that an item can be selected, it has to be made large enough. Take the region within which you want the item to be gaze selectable and add a margin of 0.5 cm (it’s so easy with perfect precision!). Note that the gaze selection region doesn’t have to be the same as the visual bounds of the item on the screen.
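    In code, this amounts to inflating the visual bounds by the accuracy margin and hit-testing gaze points against the inflated rectangle. A sketch under the perfect-precision assumption (the `Rect` type and its methods are illustrative, not part of any Tobii SDK):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Rect:
        left: float
        top: float
        right: float
        bottom: float

        def inflate(self, margin: float) -> "Rect":
            """Return a copy grown by `margin` on every side."""
            return Rect(self.left - margin, self.top - margin,
                        self.right + margin, self.bottom + margin)

        def contains(self, x: float, y: float) -> bool:
            return self.left <= x <= self.right and self.top <= y <= self.bottom

    # Visual bounds of a button, in cm on the screen:
    visual = Rect(10.0, 5.0, 14.0, 7.0)
    # Gaze-selectable region: visual bounds plus the 0.5 cm accuracy margin.
    selectable = visual.inflate(0.5)
    ```

    A reported gaze point that lands just outside the visual bounds, but inside the margin, still selects the item.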

    And then, to make sure that the right item is selected, make sure that no two gaze selectable regions overlap.
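    A standard axis-aligned rectangle test can verify that no two selectable regions overlap (again just an illustration, not SDK code):

    ```python
    def regions_overlap(a, b) -> bool:
        """Rectangles as (left, top, right, bottom) tuples; True if their interiors intersect."""
        a_l, a_t, a_r, a_b = a
        b_l, b_t, b_r, b_b = b
        return a_l < b_r and b_l < a_r and a_t < b_b and b_t < a_b

    # Two inflated regions that now collide would be flagged here,
    # telling you the items need more spacing or smaller margins.
    ```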

    But in the real world there is noise, which makes it all much more complicated. There is also the fact that accuracy and noise aren’t the same at all points on the screen: both tend to get worse as the angle between the eye tracker and the gaze point increases.

    The EyeX Engine does a lot of data processing to ensure that the right items are gaze clicked. But you can help, too, when designing your user interface. Make things large enough, and lay them out with some spacing. As a rule of thumb, if it works with touch, then it should work with gaze as well.
