
Advanced

This page contains more advanced guidance on designing for social avatars.

Gaze Filters

When creating eye behavior for player-controlled avatars in VR, it’s important that the eyes behave naturally. Here we take some of nature’s rules and apply them to the gaze signal to get as close to real eye behavior as possible.

As highly social animals, humans need to be able to quickly and easily identify people and their moods by sight, which means we have evolved heightened sensitivity to facial features and expressions. Eye movements that do not follow “the rules” are immediately noticeable, so it is worth spending some extra time filtering out those Uncanny Valley candidates as early as possible. Here are some suggestions on how to use those “rules” to filter the gaze signal and create a more robust and natural-looking avatar eye control system.

Divergence/Convergence

Real eyes converge when viewing close-up objects (or when making silly faces), but they almost never diverge. We can use this fact to filter out divergence, strengthening the signal and hence the realism: if the gaze directions of the left and right eye diverge, combine them and use the average for both eyes.
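As a rough illustration, here is a minimal Python sketch of such a divergence filter. It assumes gaze directions arrive as unit vectors in a frame with +x to the avatar’s right, +y up and +z forward; the coordinate convention and helper names are choices made for this sketch, not part of any particular SDK:

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def filter_divergence(left_dir, right_dir):
    """Replace diverging gaze directions with their average.

    With +x to the avatar's right, the eyes diverge when the left eye
    points further left than the right eye, i.e. left.x < right.x."""
    if left_dir[0] >= right_dir[0]:
        return left_dir, right_dir  # converging or parallel: keep as-is
    avg = normalize(tuple((l + r) / 2 for l, r in zip(left_dir, right_dir)))
    return avg, avg
```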

Combined Vertical Axis

Eyes essentially never point in different directions on the vertical axis: you rarely see someone with one eye looking up and the other looking down. This physiological characteristic can be used to our advantage by combining the left and right eye’s vertical components, delivering a more robust and realistic result.
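A sketch of that combination, under the same assumed coordinate conventions as above (unit vectors, +y up):

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length (same helper as above)."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def combine_vertical(left_dir, right_dir):
    """Give both eyes the same vertical component by averaging the
    y components, then renormalizing each direction."""
    avg_y = (left_dir[1] + right_dir[1]) / 2
    left = normalize((left_dir[0], avg_y, left_dir[2]))
    right = normalize((right_dir[0], avg_y, right_dir[2]))
    return left, right
```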

Smoothing

Jittering eyes can appear extremely unnerving. A simple averaging filter can be used to great effect, bearing in mind that too much smoothing can make eyes look sleepy or unwell. You might also consider increasing the amount of smoothing at greater gaze angles, since signal noise tends to increase there.
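One simple way to do this is an exponential moving average whose blend factor shrinks as the gaze angle grows. The constants below (a 0.5 base blend, extra smoothing ramping up over the first 45 degrees) are illustrative guesses to tune against your own data:

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def smooth_gaze(prev_dir, new_dir, base_alpha=0.5):
    """Exponential moving average over gaze direction vectors.

    Lower alpha means smoother (and, past a point, sleepier-looking)
    eyes. Assumes unit vectors with +z forward."""
    # Angle away from straight ahead, in radians.
    angle = math.acos(max(-1.0, min(1.0, new_dir[2])))
    # Reduce alpha (i.e. smooth more) as the angle grows.
    alpha = base_alpha * (1.0 - 0.5 * min(angle / (math.pi / 4), 1.0))
    blended = tuple(p + alpha * (n - p) for p, n in zip(prev_dir, new_dir))
    return normalize(blended)
```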

Clamping

The average pair of human eyes rarely looks up more than 25 degrees or down more than 30 degrees, nor does it look left or right further than 35 degrees. This offers another small filtering opportunity for avatar eye control. Angular clamping is useful by itself, but even more so for avatar models with larger eyeballs and smaller eye openings.
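Expressed in yaw/pitch angles (sign conventions assumed for this sketch: positive pitch up, positive yaw to the avatar’s right), the clamp is only a few lines:

```python
# Rough physiological limits quoted above, in degrees.
MAX_UP, MAX_DOWN, MAX_SIDE = 25.0, 30.0, 35.0

def clamp_gaze(yaw_deg, pitch_deg):
    """Clamp gaze angles to typical human limits."""
    yaw = max(-MAX_SIDE, min(MAX_SIDE, yaw_deg))
    pitch = max(-MAX_DOWN, min(MAX_UP, pitch_deg))
    return yaw, pitch
```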

Cross Eyed Correction

On the subject of avatar model design, it can happen that a model appears cross-eyed (or boss-eyed) even when it actually is not; adding an option to adjust for this is good design practice.
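Such an adjustment can be as simple as an artist-tunable outward yaw offset applied symmetrically to both eyes (a hypothetical parameter, using the same sign conventions as the clamping sketch):

```python
def cross_eye_correction(left_yaw_deg, right_yaw_deg, outward_deg=0.0):
    """Rotate each eye slightly outward (or inward, with a negative
    offset) to compensate for models that read as cross-eyed."""
    return left_yaw_deg - outward_deg, right_yaw_deg + outward_deg
```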


Facial Expressions

It’s not only the eyes that bring an avatar to life; facial movements also play a significant role. Although we cannot accurately predict facial expressions from eye direction alone, we can link eye motion to subtle facial expressions to make avatars feel more alive.

Feature Transitions

Simple facial feature transitions can be used to bring a face to life, for example by moving the eyebrows and the mouth.

It’s important not to overdo facial expressions based on the user’s eye movements. Users might be unaware of what facial expressions their avatar is making, and those expressions might not reflect their real expressions and mood.
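As an example of keeping such transitions subtle, the following hypothetical mapping raises the brows only slightly, and only for clearly upward gaze (the 10 and 25 degree thresholds and the 0.3 cap are illustrative):

```python
def eyebrow_raise_from_gaze(pitch_deg, max_raise=0.3):
    """Map upward gaze to a subtle eyebrow raise in [0, max_raise].

    Brows start lifting above 10 degrees up and peak at 25 degrees,
    matching the upward clamp limit used earlier."""
    t = (pitch_deg - 10.0) / (25.0 - 10.0)
    return max(0.0, min(1.0, t)) * max_raise
```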

Blend Shapes

Using Blend Shapes (or morph targets) with eased animation curves allows for more complex and convincing expressions.
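Here is a sketch of an eased transition toward a target blend shape weight. It is engine-agnostic; in Unity, for instance, the result could feed SkinnedMeshRenderer.SetBlendShapeWeight (which expects a 0–100 weight):

```python
def ease_in_out(t):
    """Smoothstep easing: slow start, fast middle, slow end."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def blend_shape_weight(elapsed_s, duration_s, target_weight):
    """Drive a blend shape (morph target) weight toward target_weight
    over duration_s seconds with an eased curve instead of snapping."""
    return ease_in_out(elapsed_s / duration_s) * target_weight
```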

Micro Expressions

Real living faces are rarely motionless; facial muscles are constantly shifting and adjusting very slightly. These movements are known as “micro expressions”. Adding this kind of random motion can also add an extra touch of realism.
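One low-effort way to approximate this is a tiny, slowly wandering offset on a facial channel, for example built from a couple of out-of-sync sine waves. The amplitude and speeds below are guesses to tune by eye:

```python
import math
import random

class MicroExpression:
    """Adds a small, slowly wandering offset to one facial channel
    (e.g. a brow or mouth-corner blend shape) so the face is never
    perfectly still."""

    def __init__(self, amplitude=0.02, speed=0.5):
        self.amplitude = amplitude
        self.speed = speed
        self.phase = random.uniform(0.0, math.tau)

    def offset(self, time_s):
        # Two out-of-sync sine waves give drift that does not look
        # obviously periodic; the result stays within +/- amplitude.
        raw = (math.sin(self.speed * time_s + self.phase)
               + 0.5 * math.sin(1.7 * self.speed * time_s))
        return self.amplitude * raw / 1.5
```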

Fallback Gaze Control System

Although eye tracking might be a killer feature in social applications, you don’t want users without an eye tracker, or the few whose eyes cannot be tracked, to suffer unnecessarily. It is therefore worth building in some kind of fallback system. There are off-the-shelf solutions that can add some sort of eye movement in such cases; they can also add random blinking, micro and macro saccades, and facial micro expressions.
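The core of such a fallback can be surprisingly small. Below is a hypothetical sketch that picks a new nearby gaze target at random intervals (macro saccades) and schedules random blinks; every timing constant is an illustrative guess, and real off-the-shelf systems are considerably more sophisticated:

```python
import random

class FallbackGaze:
    """Minimal stand-in gaze behavior for users without eye tracking."""

    def __init__(self):
        self.yaw, self.pitch = 0.0, 0.0
        self.next_saccade_s = 0.0
        self.next_blink_s = 0.0

    def update(self, time_s):
        """Return (yaw, pitch, should_blink) for the current time."""
        if time_s >= self.next_saccade_s:
            # Jump to a new target within comfortable angular limits.
            self.yaw = random.uniform(-15.0, 15.0)
            self.pitch = random.uniform(-10.0, 10.0)
            self.next_saccade_s = time_s + random.uniform(0.5, 3.0)
        should_blink = time_s >= self.next_blink_s
        if should_blink:
            self.next_blink_s = time_s + random.uniform(2.0, 6.0)
        return self.yaw, self.pitch, should_blink
```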


Shared Gaze

Leveraging user attention by sharing the users’ gaze can improve both collaboration and teaching. It can make complex tasks easier and decrease the time needed to complete tasks.

Sharing gaze can be done in several ways, for example by simulating avatar eye movement, highlighting the object being focused on, or simply by visualizing a ray from the avatar’s eyes.

Collaboration

In this video, two users are collaborating in an office setting. With the help of shared gaze, users can more easily understand what the other user is talking about.

From an audience perspective (third person), it’s easier to understand the context, and it also becomes possible to analyze the collaboration.

  • Hide your own gaze ray visualization from yourself so that it doesn’t become annoying. It is a good idea, though, to show the highlight on the object you are looking at.
  • Fade out and hide another user’s ray when it points directly at you, so you don’t get a ray in your eyes (comparable to having a laser pointer shone in your eyes); see the sketch after this list.
  • Let users toggle gaze ray visualizations for their own and other avatars, otherwise the rays can become distracting in certain situations or when there are several other avatars.
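A minimal falloff for that second point might fade the ray’s alpha as it turns toward the local viewer’s head. Both inputs are assumed to be unit vectors, and the 15 degree fade window is a made-up starting value:

```python
import math

def ray_alpha_toward_viewer(ray_dir, to_viewer_dir, fade_start_deg=15.0):
    """Return an alpha in [0, 1] for another avatar's gaze ray.

    ray_dir is the ray's direction; to_viewer_dir points from the ray
    origin toward the local viewer's head. The closer the ray aims at
    the viewer, the more transparent it becomes."""
    cos_angle = sum(a * b for a, b in zip(ray_dir, to_viewer_dir))
    angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return max(0.0, min(1.0, angle_deg / fade_start_deg))
```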

Teaching

In this video, the user to the right is teaching the user to the left how to put together a model by utilizing shared gaze. The teacher simply looks at the next model part which the other user then picks up (using gaze grab) and places on the table according to the teacher’s instructions.

This will decrease the time needed to complete the task because there is no need to manually point with a laser pointer or describe the object - you just look at the object of interest and the other user(s) understand immediately. It also frees your hands from making pointing gestures.

When visualizing an avatar’s gaze, adjust the ray’s length so it stops at the looked-at target. This makes it easier to see clearly when another user is looking at an object.
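A sketch of that clipping, assuming the hit distance comes from your engine’s raycast (e.g. Physics.Raycast in Unity) and falling back to a fixed length when nothing is hit:

```python
def clip_gaze_ray(origin, direction, hit_distance=None, max_length=10.0):
    """Return (start, end) points for a gaze ray that stops at the
    looked-at object, or at max_length if nothing is hit."""
    length = hit_distance if hit_distance is not None else max_length
    end = tuple(o + d * length for o, d in zip(origin, direction))
    return origin, end
```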

Opponent Insights

Seeing what another user is looking at can provide contextual information and hints about what they’re thinking about and planning.

Take the game of chess for example: by seeing where your opponent is looking, you can gain insights into what they might be aware of and planning. This could be an advantage, especially if only one of the players gets to see the opponent’s gaze pattern.

In this video, two players are playing a game of chess. The opponent’s eye movements are animated and a gaze ray is visualized to see where on the chess board they are looking. The player’s own chess pieces are highlighted when looked at.

A gaze bubble visualization has been added to make it easier for the viewer of this video to see where the player is looking, but it is not visible to either player.

If a player is aware that their gaze is being visualized for the other player, they can try to use this to deceive them.

Shared Gaze in Groups

Another interesting take on shared gaze is how it can be used in a group setting. For example, aggregated gaze highlights around objects of interest could help a teacher know whether students are paying attention, or improve presentations and business meetings by revealing what interests the audience.
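The aggregation itself can be as simple as counting, per frame or per time window, how many audience members are focusing on each object. The mapping from user to focused object is assumed to come from your own gaze-to-object logic:

```python
from collections import Counter

def aggregate_gaze(focused_objects):
    """Count how many users look at each object.

    focused_objects maps a user id to the object they currently focus
    on (or None). The counts can drive per-object highlight intensity
    shown only to the presenter."""
    return Counter(obj for obj in focused_objects.values() if obj is not None)

# Example: three students, two looking at the whiteboard.
attention = aggregate_gaze({"u1": "whiteboard", "u2": "whiteboard", "u3": None})
print(attention.most_common())  # [('whiteboard', 2)]
```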

The teacher/presenter should have full control of the gaze visualizations, which should usually be hidden from the audience to avoid distraction - unless there’s a point to be made by showing them.