
Recommendations

Here you can find a list of recommended scenarios where eye tracking interactions work great, what you need to know in order to implement them, and what we provide to help you do that.

We also talk about cases where it’s not recommended to add eye tracking and why that is.

Recommended:

  • Target Guidance
  • Object Manipulation
  • Social Avatars
  • Gaze Aware NPCs
  • Teleportation
  • Intuitive Interfaces

Not Recommended:

  • Retrofitting Complex Interfaces
  • Laser Eyes
  • Blinking as an Input
  • Eye Gestures
  • Pixel Perfect Targeting

Target Guidance

When throwing, shooting, using a bow and arrow, or using similar methods of targeting, you can help users achieve their goal by guiding the trajectory towards the intended target.

  • This influences the game design and the challenge of the experience. Always hitting the target becomes boring, which can be counteracted by offering guidance as a temporary power-up or by using less aggressive guidance, fine-tuned for your specific experience.
  • People need a clear target to focus on. It’s difficult to, for example, throw a grenade at an area on the ground where there is no clear point of interest to focus on.
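
Below is a minimal Unity C# sketch of this kind of trajectory guidance. The `gazeTarget` field is an assumption: it would be set from your eye tracking SDK's gaze raycast at the moment the projectile is fired. `guidanceStrength` is the tuning knob mentioned above.

```csharp
using UnityEngine;

// Sketch: bends a projectile's velocity toward the target the user was
// looking at when it was fired. Keep guidanceStrength low, or enable the
// assist only as a temporary power-up, to preserve the challenge.
public class GuidedProjectile : MonoBehaviour
{
    public Rigidbody body;          // the projectile's rigidbody
    public Transform gazeTarget;    // assumption: set from your SDK's gaze raycast
    [Range(0f, 1f)] public float guidanceStrength = 0.2f;

    void FixedUpdate()
    {
        if (gazeTarget == null) return;

        Vector3 toTarget = (gazeTarget.position - body.position).normalized;
        float speed = body.velocity.magnitude;

        // Rotate the velocity vector slightly toward the target each physics
        // step; a higher guidanceStrength rotates it faster.
        Vector3 newDirection = Vector3.RotateTowards(
            body.velocity.normalized, toTarget,
            guidanceStrength * 10f * Time.fixedDeltaTime, 0f);

        body.velocity = newDirection * speed;
    }
}
```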

Related Material:

  • Read more about this in our Hand-Eye Coordination design section.
  • Try out and customize throwing in our Unity examples.
  • Throw magic stones at bottles or target flying drones with magic bullets in our Mirrors Demo.

Object Manipulation

Being able to manipulate objects from a distance, like picking things up, moving levers and pressing buttons, can give the user a feeling of having superpowers or make the interaction feel effortless. It empowers the user, helping them achieve their goals more easily.

  • Gaze-aware objects require spacing between them, as well as visual feedback that signals to the user that they can be interacted with.
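
A minimal Unity C# sketch of a gaze-grabbable object, assuming a hypothetical `SetFocused` hook that your gaze raycasting code would call: the highlight is the visual feedback mentioned above, and the button press keeps the activation explicit.

```csharp
using UnityEngine;

// Sketch: a distant object that highlights while looked at and flies to the
// user's hand on an explicit button press. SetFocused is an assumed hook,
// called by whatever code raycasts your SDK's gaze ray against the scene.
public class GazeGrabbable : MonoBehaviour
{
    public Transform hand;                    // where the object flies to
    public Color highlightColor = Color.cyan; // feedback: "I am interactable"
    public float pullSpeed = 8f;

    Renderer rend;
    Color baseColor;
    bool focused;
    bool pulling;

    void Awake()
    {
        rend = GetComponent<Renderer>();
        baseColor = rend.material.color;
    }

    public void SetFocused(bool hasFocus)
    {
        focused = hasFocus;
        rend.material.color = hasFocus ? highlightColor : baseColor;
    }

    void Update()
    {
        // Explicit activation: look at the object, then press the trigger.
        if (focused && Input.GetButtonDown("Fire1"))
            pulling = true;

        if (pulling)
            transform.position = Vector3.MoveTowards(
                transform.position, hand.position, pullSpeed * Time.deltaTime);
    }
}
```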

Social Avatars

If you have avatars controlled by users, you can make their eyes match the user’s. This will make each user’s avatar feel more alive and responsive and is especially powerful in multiplayer interactions where it adds a completely new layer of depth to the social interaction.

  • Simulating facial movements with eye tracking makes the avatar look more alive and combats the risk of an uncanny feeling when creating more realistic avatars.
  • Exaggerated facial movements can be dangerous because the user might not be aware of what expression their avatar is making. It’s important to convey this to the user.
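
A minimal Unity C# sketch of driving an avatar's eyes, assuming `SetGazeDirection` is fed each frame with the user's gaze direction (in head-local space) from your eye tracking SDK. The clamp addresses the point above: it keeps tracking noise from producing extreme eye poses the user is unaware of.

```csharp
using UnityEngine;

// Sketch: rotates an avatar's eye bones to match the user's tracked gaze.
// Clamping and smoothing keep noisy or extreme samples from creating
// expressions the user did not intend.
public class AvatarEyes : MonoBehaviour
{
    public Transform leftEye;
    public Transform rightEye;
    public float maxEyeAngle = 35f; // clamp to a plausible human range
    public float smoothing = 20f;   // higher = snappier

    // Assumption: called every frame with a head-local gaze direction.
    public void SetGazeDirection(Vector3 gazeDirectionLocal)
    {
        Quaternion target = Quaternion.LookRotation(gazeDirectionLocal);
        target = Quaternion.RotateTowards(Quaternion.identity, target, maxEyeAngle);

        leftEye.localRotation = Quaternion.Slerp(
            leftEye.localRotation, target, smoothing * Time.deltaTime);
        rightEye.localRotation = Quaternion.Slerp(
            rightEye.localRotation, target, smoothing * Time.deltaTime);
    }
}
```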

Related Material:

  • Read more about avatars in our Social design section.
  • Try and customize avatars in our Unity examples.
  • Look at yourself in the mirror while controlling an avatar in our Mirrors Demo.

Gaze Aware NPCs

Having non-player characters (NPCs) with gaze awareness makes your world and social interactions feel more immersive and life-like. By expressing emotions and reacting to the user’s gaze, NPCs can leave a strong impact.

  • While it is simple to implement basic behavior, more complex social interactions can be difficult to create.
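
The basic behavior can be as simple as the Unity C# sketch below, assuming a `Tick` method that receives the user's gaze ray from your eye tracking SDK each frame. The dwell timer filters out incidental glances before the NPC reacts.

```csharp
using UnityEngine;

// Sketch: an NPC that notices being looked at. Once the user's gaze ray has
// rested on the NPC's collider for dwellSeconds, the NPC meets their gaze.
public class GazeAwareNpc : MonoBehaviour
{
    public Transform npcHead;
    public Transform userHead;
    public float dwellSeconds = 0.5f; // filters out incidental glances
    public float turnSpeed = 3f;

    float gazeTimer;

    // Assumption: called every frame with the gaze ray from your SDK.
    public void Tick(Ray userGazeRay)
    {
        bool lookedAt = Physics.Raycast(userGazeRay, out RaycastHit hit, 20f)
                        && hit.transform == transform;

        gazeTimer = lookedAt ? gazeTimer + Time.deltaTime : 0f;

        if (gazeTimer >= dwellSeconds)
        {
            // Simple reaction: turn the head to meet the user's gaze.
            Quaternion look = Quaternion.LookRotation(
                userHead.position - npcHead.position);
            npcHead.rotation = Quaternion.Slerp(
                npcHead.rotation, look, turnSpeed * Time.deltaTime);
        }
    }
}
```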

Related Material:

  • Read more about NPCs in our Social design section.
  • Interact with NPCs in our Mirrors Demo.

Teleportation

Teleporting to specific locations simply by gazing at them and clicking to go there feels great. This is especially useful if you want your users to be able to quickly teleport or if their hands are occupied (e.g., using equipment such as guns).

  • Allowing free teleportation anywhere can be difficult because the eyes need specific stimuli to focus on. Having a strong point of interest to look at will allow players to teleport more easily.
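
A minimal Unity C# sketch, assuming the gaze ray comes from your eye tracking SDK and that valid destinations carry a hypothetical "TeleportAnchor" tag. Restricting teleportation to such anchors gives the eyes the strong points of interest described above.

```csharp
using UnityEngine;

// Sketch: gaze-and-click teleportation. The button press is the explicit
// activation; tagged anchors give the eyes clear targets to focus on.
public class GazeTeleport : MonoBehaviour
{
    public Transform playerRig; // the rig that gets moved

    // Assumption: called every frame with the gaze ray from your SDK.
    public void Tick(Ray gazeRay)
    {
        if (!Input.GetButtonDown("Fire1")) return;

        if (Physics.Raycast(gazeRay, out RaycastHit hit, 50f)
            && hit.collider.CompareTag("TeleportAnchor"))
        {
            playerRig.position = hit.collider.transform.position;
        }
    }
}
```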

Intuitive Interfaces

Gaze-enabled user interfaces are quick and intuitive. To interact, users only have to look and press, instead of looking, pointing and pressing.

  • Requires an understanding of the fundamental design principles of eye tracking and can be difficult to get right.
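
The core look-and-press pattern fits in a short Unity C# sketch, assuming a hypothetical `SetFocused` hook driven by your gaze raycasting code. Note the subtle feedback on focus and the single button press to activate.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch: a "look and press" button. Gaze only sets focus; a controller
// press activates whatever is currently focused, so no pointing is needed.
public class GazeButton : MonoBehaviour
{
    public UnityEvent onActivated;
    public Vector3 focusedScale = new Vector3(1.1f, 1.1f, 1.1f);

    Vector3 baseScale;
    bool focused;

    void Awake() => baseScale = transform.localScale;

    // Assumption: called by your gaze raycasting code.
    public void SetFocused(bool hasFocus)
    {
        focused = hasFocus;
        // Subtle feedback confirms the button knows it is being looked at.
        transform.localScale = hasFocus ? focusedScale : baseScale;
    }

    void Update()
    {
        if (focused && Input.GetButtonDown("Submit"))
            onActivated.Invoke();
    }
}
```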

Retrofitting Complex Interfaces

Adding eye tracking to an existing UI or menu that was designed for another input, like a VR pointer, can be difficult. The challenges are analogous to adapting a desktop interface to work in VR: using a new input method means that the layout may need to change.

  • You might need to design the UI from scratch based on eye tracking design fundamentals.
  • Retrofitting can work in some cases, but success is heavily dependent on things like spacing, visual weight, animations and functionality.

Laser Eyes

Interactions such as having laser eyes or shooting wherever you look come with a list of complications. The main challenge is that we move our eyes subconsciously to read information about our surroundings, so if your eyes are used to shoot things, you may accidentally shoot a lot of things. When using eye tracking, you should allow people to move their eyes naturally, without unintended consequences.

  • Most eye tracking interactions should have an explicit activation method, such as pressing a button, to allow users to look around without accidentally activating objects.
  • This is often referred to as the ‘Midas Touch’ problem: everything you look at turns to gold, whether you want it to or not.
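
A minimal Unity C# sketch of that explicit-activation pattern, assuming the gaze ray comes from your eye tracking SDK: gaze only selects, and nothing fires until the user deliberately presses a button.

```csharp
using UnityEngine;

// Sketch: avoiding the Midas Touch. Looking around freely only updates the
// current selection; the consequence requires a deliberate button press.
public class GazeSelectThenFire : MonoBehaviour
{
    Transform selected;

    // Assumption: called every frame with the gaze ray from your SDK.
    public void Tick(Ray gazeRay)
    {
        selected = Physics.Raycast(gazeRay, out RaycastHit hit, 100f)
            ? hit.transform
            : null;

        if (selected != null && Input.GetButtonDown("Fire1"))
            Debug.Log($"Fired at {selected.name}"); // placeholder effect
    }
}
```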

Blinking as an Input

Blinking as an input method is in most cases a bad idea.

  • We need to blink now and then, which makes accidental activation unavoidable and leads to a bad experience.
  • Blinking on command is straining and feels unnatural.

Eye Gestures

Asking users to make a gesture with their eyes is straining and difficult, unless they have a moving point of interest to follow (this is referred to as ‘smooth pursuit’). Forcing the user to make eye gestures to activate something, without showing a moving target to follow, should be avoided.

  • It is straining to move our eyes in specific patterns, and while doing so, we are perceptually blind to our surroundings.
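
If you do want a gesture-like activation, smooth pursuit is the exception that works. A minimal Unity C# sketch, assuming the gaze direction comes from your eye tracking SDK: the selection triggers only after the gaze has tracked a moving stimulus within a small angular tolerance for long enough.

```csharp
using UnityEngine;

// Sketch: smooth-pursuit selection. The user follows a moving stimulus with
// their eyes; sustained angular proximity confirms the selection.
public class SmoothPursuitSelector : MonoBehaviour
{
    public Transform movingStimulus; // the target the user follows
    public Transform head;           // gaze origin
    public float maxAngle = 3f;      // tolerance in degrees
    public float requiredSeconds = 1f;

    float followTimer;

    // Assumption: called every frame with the gaze direction from your SDK.
    // Returns true once pursuit has been sustained long enough.
    public bool Tick(Vector3 gazeDirection)
    {
        Vector3 toStimulus = movingStimulus.position - head.position;
        float angle = Vector3.Angle(gazeDirection, toStimulus);

        followTimer = angle <= maxAngle ? followTimer + Time.deltaTime : 0f;
        return followTimer >= requiredSeconds;
    }
}
```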

Pixel Perfect Targeting

Creating experiences that depend on eye tracking being able to target elements with pixel-perfect accuracy is a bad idea. This doesn’t work for VR in general and is especially true when designing for eye tracking, where people experience various degrees of eye tracking quality.

  • Due to the nature of our eyes, we cannot focus on individual pixels; we need larger points of interest to look at.
  • Elements that can be interacted with need to be spaced apart or support multi-selection (e.g., objects that can be picked up need to be spaced out, or all nearby objects get picked up together when interacted with).
  • Certain visual tricks can be used to make it easier for users to focus on elements, like centering the visual weight.
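
To size elements for gaze rather than pixels, you can work in degrees of visual angle. A minimal Unity C# sketch follows; the 3 degree default is an assumption, so tune it to the accuracy you actually observe on your target hardware.

```csharp
using UnityEngine;

// Sketch: computes the minimum world-space diameter an interactable element
// should have, given an assumed gaze accuracy in degrees of visual angle.
public static class GazeTargetSizing
{
    public static float MinTargetDiameter(float distanceMeters,
                                          float accuracyDegrees = 3f)
    {
        // Width subtended by the accuracy cone at the given distance.
        return 2f * distanceMeters
                  * Mathf.Tan(accuracyDegrees * 0.5f * Mathf.Deg2Rad);
    }
}

// Example: an element 2 m away should be at least ~0.1 m across:
//   float size = GazeTargetSizing.MinTargetDiameter(2f); // ≈ 0.105
```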
