In this post I’d like to highlight four ways you can create a truly immersive game experience using eye tracking:

Natural Targeting
Give your game character the ability to run in one direction and simultaneously aim, shoot or pick something up in another direction.

Eye tracking tells you where on the screen the player is looking, and that information can serve as an additional input to your game. The Tobii EyeX Engine API provides eye-gaze data streams that can be used for this purpose. In the game Son of Nor, the Tobii EyeX SDK for Unity was used to access an eye-gaze data stream. The player can pick up stones from anywhere on the screen and throw them anywhere on the screen, just by looking at them and pressing a mouse button. Son of Nor’s lead developer Julian Mautner talks about how he integrated Tobii EyeX into the game in this Dev Diary (starts at 2:35):

An eye-gaze data stream cannot give a pixel-perfect location of where the player is looking; eye tracking simply does not deliver that kind of precision. Julian therefore developed an intelligent system combining filtering, ray-casting and heuristics to determine which object the player is looking at, so that the correct item is picked up when the mouse button is pressed.
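The combination of filtering and a nearest-object heuristic can be sketched in a few lines. This is not Julian’s actual implementation or any Tobii API, just a minimal illustration of the idea: smooth the noisy gaze stream with an exponential moving average, then accept the closest object within a tolerance radius as the target (a simple stand-in for full ray-casting).

```python
import math

class GazeTargetPicker:
    """Minimal sketch of gaze-based target selection: smooth the noisy
    gaze stream, then pick the nearest object within a tolerance radius.
    All parameter values are illustrative, not from the EyeX SDK."""

    def __init__(self, smoothing=0.8, pick_radius=60.0):
        self.smoothing = smoothing      # weight given to the previous estimate
        self.pick_radius = pick_radius  # max distance in pixels to accept a hit
        self.filtered = None            # current smoothed gaze point

    def update(self, raw_gaze):
        """Feed one raw (x, y) gaze sample; returns the smoothed point."""
        if self.filtered is None:
            self.filtered = raw_gaze
        else:
            a = self.smoothing
            self.filtered = (a * self.filtered[0] + (1 - a) * raw_gaze[0],
                             a * self.filtered[1] + (1 - a) * raw_gaze[1])
        return self.filtered

    def pick(self, objects):
        """objects: dict of name -> (x, y) screen position.
        Returns the closest object within pick_radius, or None."""
        if self.filtered is None:
            return None
        best, best_dist = None, self.pick_radius
        for name, (x, y) in objects.items():
            d = math.hypot(x - self.filtered[0], y - self.filtered[1])
            if d < best_dist:
                best, best_dist = name, d
        return best
```

In a game loop you would call `update()` on every new gaze sample and `pick()` when the mouse button is pressed, so the pick always uses the stabilized gaze point rather than a single noisy sample.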

Response to Eye Contact
Make characters or environments in a game more alive by reacting to your eye contact.

Eye contact is a powerful communication tool in our daily interactions with other human beings. With eye tracking it is now possible to bring this form of communication into a video game. You could achieve this by filtering an eye-gaze data stream, using ray-casting and implementing heuristics to find the objects the player is looking at, but the Tobii EyeX Engine actually comes with built-in support for tasks like this. The Tobii EyeX API lets you define regions on the screen that are gaze-aware. A gaze-aware region knows when the player starts and stops looking at it. You can also specify a time delay: how long the player has to look at the region before it considers itself looked at. To implement a game character that is aware of eye contact, simply put a gaze-aware region around the area of the character’s eyes:

An example of a GazeAware region to detect eye contact.

In the Tobii EyeX SDK for Unity, game objects can be made gaze-aware by simply adding a GazeAware component to them. This tutorial explains how simple it is to get started with EyeX in Unity and create a cube that spins when you look at it: Getting Started With the Tobii EyeX SDK for Unity.
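To make the behavior of such a region concrete, here is a small sketch of the concept, assuming nothing about the actual EyeX implementation: a rectangular region that tracks when the gaze enters and leaves it, and only reports eye contact after the gaze has dwelt inside it for the configured delay.

```python
class GazeAwareRegion:
    """Sketch of a gaze-aware region with a dwell-time delay.
    Hypothetical illustration of the concept, not the EyeX API."""

    def __init__(self, x, y, width, height, delay=0.0):
        self.rect = (x, y, width, height)
        self.delay = delay        # seconds the gaze must stay before activation
        self._enter_time = None   # when the gaze last entered the region
        self.has_gaze = False

    def update(self, gaze, t):
        """gaze: (x, y) point or None if no gaze data; t: current time in seconds.
        Returns True once the gaze has rested on the region for `delay` seconds."""
        x, y, w, h = self.rect
        inside = (gaze is not None
                  and x <= gaze[0] <= x + w
                  and y <= gaze[1] <= y + h)
        if inside:
            if self._enter_time is None:
                self._enter_time = t   # gaze just entered; start the dwell timer
            self.has_gaze = (t - self._enter_time) >= self.delay
        else:
            self._enter_time = None    # gaze left; reset immediately
            self.has_gaze = False
        return self.has_gaze
```

The delay is what makes eye contact feel deliberate: a character reacts when the player actually holds their gaze, not when it merely sweeps past.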

Immersive Graphics and Sound
Create a game where looking at specific objects in the game triggers them to produce a noise or react.

Where you are looking also approximates where your attention is. In a crowded room you can focus on the person you are talking to while filtering out other sounds and most of the people moving about around you. With eye tracking it is possible to achieve similar effects in a game: you can adapt the sounds according to where the player is looking, and you can make the graphics adaptive by creating a dynamic depth-of-field effect:

In the Tobii EyeX Plugin for Unreal Engine 4 we have included a sample that shows how to create an adaptive depth-of-field effect like this. It takes the distance from the camera to the point where the eye-gaze ray meets an object, and uses that as the focal distance in a post-process shader that creates a shallow depth of field around the given focal distance.
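The core calculation is small enough to sketch outside any engine. This is a hypothetical illustration of the idea described above, not the Unreal sample’s code: measure the camera-to-hit-point distance and smooth it over time before handing it to the shader, so the focus does not pop every time the gaze jumps.

```python
import math

def gaze_focal_distance(camera_pos, hit_point, prev_focal, smoothing=0.9):
    """Sketch of gaze-driven depth of field: take the distance from the
    camera to the point where the gaze ray hits scene geometry, smooth it,
    and use the result as the focal distance for a post-process shader.
    Positions are (x, y, z) tuples; the smoothing factor is illustrative."""
    dist = math.dist(camera_pos, hit_point)  # camera-to-focus distance
    # Exponential smoothing keeps the focus from snapping on every saccade.
    return smoothing * prev_focal + (1 - smoothing) * dist
```

Calling this once per frame with the latest gaze hit point yields a focal distance that drifts smoothly toward whatever the player is looking at.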

Infinite Screen
Make your players feel as if they had stepped right into the game by using the player’s eye-gaze to control the game character’s field of vision.

Traditionally in third-person games, you use the mouse to pan the scene camera: the screen is perceived as something static with fixed borders, and you move the mouse to change your field of view. With the Tobii EyeX Engine API you can instead use the player’s eye-gaze to control the field of vision. Just by looking around, the player makes the scene camera pan and adapt to where they want to look. The expansion of the field of view becomes limitless, creating an infinite screen effect. To implement this, you use one of the eye-gaze data streams to know where the player is looking, and then apply a smoothing algorithm that moves the point the player is looking at toward the center of the screen. Typically you want little or no panning close to the center of the screen, and faster, more responsive panning the farther out toward the edges of the screen the player looks.
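The panning curve described above can be sketched as a simple function: a dead zone in the middle of the screen where the camera stays still, then a ramp that accelerates toward the screen edges. The dead-zone size, ramp shape and speed are all illustrative assumptions, not values from any EyeX implementation.

```python
def pan_velocity(gaze, screen_w, screen_h, dead_zone=0.2, max_speed=200.0):
    """Sketch of an infinite-screen panning curve. Returns a camera pan
    velocity (vx, vy) in pixels/second: zero while the gaze is near the
    screen center, ramping up quadratically toward the edges."""
    cx, cy = screen_w / 2.0, screen_h / 2.0
    # Normalized offset from the screen center, each axis in [-1, 1].
    nx = (gaze[0] - cx) / cx
    ny = (gaze[1] - cy) / cy

    def ramp(n):
        mag = abs(n)
        if mag <= dead_zone:          # dead zone: no camera movement
            return 0.0
        # Quadratic ramp from the dead-zone edge out to the screen edge.
        t = (mag - dead_zone) / (1.0 - dead_zone)
        return max_speed * t * t * (1.0 if n > 0 else -1.0)

    return ramp(nx), ramp(ny)
```

The quadratic ramp is one way to get the "slow near the center, fast at the edges" feel; in practice you would tune the curve and dead zone until the camera follows the gaze without drawing attention to itself.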

In the video below, Ubisoft producer Corneliu Vasiliu shares some insight into the implementation of an infinite screen effect in Assassin’s Creed Rogue PC, using Tobii EyeX:

Are you working on an interesting project involving eye tracking, or is this something you would be interested in integrating into your product or game? Let us know!