
Hardware Accuracy


User Accuracy

Below is an actual screenshot from an HTC VIVE, which has a visual field of view of roughly 90° both horizontally and vertically. For reference, note that the thumb in the bottom right is about 2° wide. Our data shows that 95% of Tobii users can track better than 2° when their eyes are looking straight ahead.
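For context, converting between visual angle and world-space size is simple trigonometry. Below is a minimal sketch in plain Unity C#; the class and method names are our own, not part of the Tobii XR SDK.

```csharp
using UnityEngine;

// Hypothetical helper, not part of the Tobii XR SDK.
public static class VisualAngleUtil
{
    // Diameter (meters) an object needs at 'distance' meters
    // to subtend 'angleDeg' degrees of visual angle.
    public static float DiameterForAngle(float angleDeg, float distance)
    {
        return 2f * distance * Mathf.Tan(0.5f * angleDeg * Mathf.Deg2Rad);
    }

    // Visual angle (degrees) subtended by an object 'diameter' meters
    // across, seen from 'distance' meters away.
    public static float AngleForDiameter(float diameter, float distance)
    {
        return 2f * Mathf.Atan(diameter / (2f * distance)) * Mathf.Rad2Deg;
    }
}
```

At 2 m, for example, a 2° target comes out to 2 × 2 m × tan(1°) ≈ 0.07 m across, about the apparent width of a thumb held at arm's length.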

Play around with the slider below to better understand what size objects you can expect people to consistently interact with while using the Tobii XR SDK. Mousing over a point will tell you that point's inaccuracy. A setting of 95% means that out of 100 people, you are seeing the 95th most-accurate person's result. Therefore, 94% of people would have better accuracy than what is shown.


[Interactive slider: "% of users have this accuracy or better"]
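If you want to reproduce the slider's math against your own measurements, it is just a percentile over per-user accuracy values. A minimal sketch, assuming one aggregate error value per user; the helper below is hypothetical, not part of the Tobii XR SDK.

```csharp
using System;
using System.Linq;

// Hypothetical helper, not part of the Tobii XR SDK.
public static class AccuracyStats
{
    // Returns the inaccuracy (degrees) such that 'percent'% of users
    // are at least this accurate, e.g. percent = 95 answers
    // "95% of users have this accuracy or better".
    public static float AccuracyAtPercentile(float[] perUserErrorsDeg, float percent)
    {
        float[] sorted = perUserErrorsDeg.OrderBy(e => e).ToArray(); // most accurate first
        int rank = (int)Math.Ceiling(percent / 100.0 * sorted.Length);
        rank = Math.Max(1, Math.Min(rank, sorted.Length));
        return sorted[rank - 1]; // the rank-th most-accurate user's error
    }
}
```

With 100 users and a setting of 95%, this returns the 95th most-accurate user's error, matching the slider's behavior described above.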

After playing with the Bubble Graph, you've probably noticed that a very small percentage of our users show somewhat large inaccuracies as their gaze moves farther from center. But the Heatmap shows the angles where users actually gaze, and accuracy remains very good for all users within the areas where people most frequently look.

Remember, people prefer to look straight ahead, and we tend to turn our heads rather than our eyes for large changes in gaze angle.
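A practical consequence: the angle that matters for accuracy is the gaze direction relative to the head, not to the world. A hedged Unity sketch, assuming you already have a world-space gaze direction (for example, from the SDK's eye tracking data) and a reference to the HMD camera's transform; the names here are ours, not SDK API.

```csharp
using UnityEngine;

// Sketch only: the field and method names are ours, not SDK API.
public class GazeEccentricity : MonoBehaviour
{
    [SerializeField] private Transform hmdCamera; // the user's head pose

    // Degrees between where the head points and where the eyes point.
    // Large values are rare in practice: people turn their heads instead.
    public float EccentricityDeg(Vector3 worldGazeDirection)
    {
        return Vector3.Angle(hmdCamera.forward, worldGazeDirection);
    }
}
```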


Drag Tool

Having seen the range and trends of user input on the Bubble Graph, feel free to click and drag around the visualizer on the second tab above, called Drag Tool.

This is a different way of showing the same data: clicking and dragging interactively shows what object sizes can easily be interacted with at different angles and for different percentages of users. We fade the circle's opacity based on the Heatmap – the more transparent the circle becomes, the less often users will look at that angle without turning their head.
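The fade itself is easy to reproduce. A small Unity sketch of the idea, assuming you can normalize the heatmap's gaze frequency at a given angle to 0–1; the names are ours, not the visualizer's actual code.

```csharp
using UnityEngine;

// Sketch of the fade idea; not the visualizer's actual code.
public class HeatmapFade : MonoBehaviour
{
    [SerializeField] private SpriteRenderer circle;

    // frequency: how often users gaze at this angle without turning
    // their head, normalized to 0..1 from heatmap data.
    public void SetOpacityFromFrequency(float frequency)
    {
        Color c = circle.color;
        c.a = Mathf.Clamp01(frequency); // rarely-gazed angles fade out
        circle.color = c;
    }
}
```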

With this in mind, it is fun and informative to toggle off the Heatmap here and measure different object sizes in the scene, showing what percentage of users can hit objects of different sizes at different angles.

The data included in these graphs is based upon a small internal test at Tobii and shouldn’t be taken as our hardware’s official numbers (see link at bottom of page).

Also note that we only tested points out to 25 degrees from center. As such, areas beyond that on the Drag Tool are extrapolated and probably won't reflect real-world performance. Similarly, this data is purposefully presented in a simplified manner, and real individuals will likely see more variance from point to point. These numbers are presented purely as a learning tool.

We're hoping you take away an understanding that eye tracking becomes less accurate the farther from center you look, that people usually turn their heads to avoid large gaze angles, and that eye tracking interactions will differ slightly between individuals.
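A common way to act on these takeaways is to grow gaze targets (or their colliders) with eccentricity, so that peripheral targets tolerate the larger errors. A minimal sketch; the tuning values below are placeholders to calibrate against your own data, not official Tobii numbers.

```csharp
using UnityEngine;

// Sketch only; tune the defaults against your own measurements.
public static class TargetSizing
{
    // Minimum visual angle (degrees) a gaze target should subtend at a
    // given eccentricity from straight ahead. The defaults below are
    // placeholder tuning values, not official Tobii numbers.
    public static float MinTargetAngleDeg(
        float eccentricityDeg,
        float baseAngleDeg = 2f,    // roughly the 95th-percentile error at center
        float growthPerDeg = 0.05f) // extra margin per degree off-center
    {
        return baseAngleDeg + growthPerDeg * Mathf.Abs(eccentricityDeg);
    }
}
```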


Conclusions and Tools to Help

Tobii hardware is robust enough that everyone should have great eye-tracked interactions, even though personal behavior and hardware response will vary between individuals and activities. “Normal”-sized objects should be easy to interact with inside people's most-preferred viewing angles, while tiny objects may be difficult.

To further improve the user's ability to interact with objects, we provide a machine-learned interaction layer for you to integrate into your applications (see Further Reading below).
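To see why such a layer helps, consider the naive baseline it improves on: picking whichever object's center is angularly closest to the gaze ray. A hedged Unity sketch of that baseline (ours, not Tobii's machine-learned implementation); it works for large, well-separated objects but breaks down for small or overlapping ones.

```csharp
using UnityEngine;

// Naive baseline for gaze-to-object mapping; not Tobii's ML layer.
public static class NaiveGazePicker
{
    // Returns the candidate whose center is angularly closest to the
    // gaze ray, or null if none is within maxAngleDeg.
    public static Transform Pick(Ray gazeRay, Transform[] candidates, float maxAngleDeg = 3f)
    {
        Transform best = null;
        float bestAngle = maxAngleDeg;
        foreach (Transform t in candidates)
        {
            float angle = Vector3.Angle(gazeRay.direction, t.position - gazeRay.origin);
            if (angle < bestAngle)
            {
                bestAngle = angle;
                best = t;
            }
        }
        return best;
    }
}
```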

And to share what we have learned, we have outlined a set of visual feedback design guidelines to help you increase user confidence, along with UI examples and reusable Unity scripts that make this easy to implement in your own code (see Further Reading below).
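As one example of such feedback, a gaze highlight can be a very small script. The sketch below follows the IGazeFocusable callback style from Tobii's G2OM Unity integration; treat the namespace and interface name as assumptions and verify them against your SDK version.

```csharp
using Tobii.G2OM; // assumed namespace for IGazeFocusable; verify against your SDK version
using UnityEngine;

public class HighlightAtGaze : MonoBehaviour, IGazeFocusable
{
    [SerializeField] private Color highlightColor = Color.cyan;
    private Renderer _renderer;
    private Color _originalColor;

    private void Start()
    {
        _renderer = GetComponent<Renderer>();
        _originalColor = _renderer.material.color; // assumes a material with a color property
    }

    // Called when this object gains or loses gaze focus.
    public void GazeFocusChanged(bool hasFocus)
    {
        _renderer.material.color = hasFocus ? highlightColor : _originalColor;
    }
}
```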

Further Reading