Over the years we at Tobii have experimented with many different ways of using gaze data to enhance the experience of interacting with computers. It’s been a lot of fun: there have been many prototypes, and even more crazy ideas. We’ve narrowed them down to four guiding principles that will hopefully help you avoid some pitfalls, as well as inspire you to innovate something cool!
Eyes are made for looking around
Your eyes are constantly moving around to help the brain build the image that you see. In order to avoid eye strain and fatigue, it is important to allow the eyes to move freely. For example, an interaction that requires the eyes to focus on one spot for too long can be straining. In a similar way, explicit motor tasks such as gestures, drag and drop, or blinking to click are generally not natural for the eye. Design your application in a way that lets the eyes be eyes.
Eyes and hands work well together
So if we let the eyes move freely, how can we then interact? Well, the hands are great for physical interaction, and the good news here is that eyes and hands are naturally synchronized. In many cases there is a sweet spot in time where the eyes naturally look at what you interact with, thus providing input that can make the physical interaction much more direct. For example, pressing a physical button to click the on-screen button you’re looking at feels entirely natural. Eyes don’t necessarily replace the hands; they complement them.
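As a minimal sketch of the gaze-plus-button pattern described above: when the physical button is pressed, the target is simply whatever the eyes are on at that moment. Everything here is hypothetical illustration, not a real Tobii API; in practice the gaze point would come from an eye tracker stream.

```python
from dataclasses import dataclass

@dataclass
class Button:
    """A hypothetical on-screen button with a rectangular hit area."""
    name: str
    x: float       # top-left corner, in screen pixels
    y: float
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        """True if the gaze point (gx, gy) falls inside this button."""
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

def activate_on_press(buttons, gaze_x, gaze_y):
    """Called when the physical button is pressed: the user selects the
    target just by looking at it, so no pointing motion is needed."""
    for button in buttons:
        if button.contains(gaze_x, gaze_y):
            return button.name
    return None  # the gaze was not on any button; ignore the press
```

For example, with `buttons = [Button("save", 10, 10, 100, 40), Button("cancel", 120, 10, 100, 40)]`, a press while looking at (150, 30) activates "cancel". Note that the eyes only *select*; the hand still *commits* the action, which keeps the interaction deliberate.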
Eyes are curious
Your eyes move at will when you want to look at something, but they also move unconsciously. They naturally land on texts and images, are drawn to contours, follow movements, etc. They can be both guided and distracted depending on your visual design. For example, giving your on-screen button a strong contour, or placing its text close to the edge, will draw the gaze toward the border and make robust interaction harder, since your click risks missing the target. Your eyes will also be drawn unconsciously to movement and contrast in your peripheral view. How does your visual design and layout affect the eye interaction?
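One common mitigation for gaze landing near a button's border is to test the gaze point against an activation area somewhat larger than the visible button. This is a sketch under our own assumptions (the padding value and function are made up, not part of any SDK):

```python
def gaze_hit(gx, gy, x, y, width, height, padding=20.0):
    """Test a gaze point against a button rectangle expanded by `padding`
    pixels on every side, so a gaze drawn just outside the visible edge
    (by a contour, or by text sitting near the border) still counts."""
    return (x - padding <= gx <= x + width + padding
            and y - padding <= gy <= y + height + padding)
```

In a dense layout the padded areas of neighboring buttons can overlap, so a fuller implementation would pick the nearest target rather than the first match; generous spacing between gaze targets sidesteps the problem entirely.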
Eye movements provide information
Using eyes for explicit interaction such as clicking on buttons in a user interface is awesome, but there’s more to eye interaction than that. By analyzing how the eyes are moving, one can start exploring the growing field of implicit interaction: while the user is using the product as she normally does, the interaction can be further enhanced. For example, turning on the display when the user looks at it, removing a notification once it has been read, or letting a game character respond to your gaze.
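The notification example above can be sketched as a tiny dwell-time model: accumulate how long the gaze rests on the notification and dismiss it once that exceeds a rough reading-time estimate. The class, the words-per-second figure, and the frame-update shape are all our own assumptions for illustration.

```python
class Notification:
    """Dismisses itself once the user's gaze has dwelt on it long enough
    to have plausibly read it -- an implicit interaction: the user never
    issues a command, they just read as they normally would."""

    def __init__(self, text: str, seconds_per_word: float = 0.25):
        self.text = text
        # Crude reading-time estimate (~4 words per second); a real
        # product would tune or learn this per user.
        self.required_dwell = len(text.split()) * seconds_per_word
        self.dwell = 0.0
        self.dismissed = False

    def update(self, gaze_on_notification: bool, dt: float) -> None:
        """Call once per frame with whether the gaze is currently on the
        notification and the frame duration dt in seconds."""
        if self.dismissed:
            return
        if gaze_on_notification:
            self.dwell += dt
            if self.dwell >= self.required_dwell:
                self.dismissed = True
```

The same accumulate-and-threshold structure carries over to the other examples: waking the display after the gaze has rested on it briefly, or having a game character react once it has "noticed" being looked at.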
We hope you found this helpful. Enjoy, and please feel free to comment, discuss this, or share additional knowledge. Oh, and by all means, experiment and innovate!