How to get eye coordinates from HTC Vive Pro Eye

Viewing 9 posts - 16 through 24 (of 24 total)

    Thanks all for your responses. I was having similar questions, and your answers were helpful to me.

    Grant [Tobii]

    Hi @dragontattoo, great! glad to hear it 🙂 Please don’t hesitate to get in touch again should you require any further assistance. Best Wishes.

    Marco Otte

    Here’s what I did, and as far as I can see it works.
    I used the Sample Scene from the Tobii SDK (the TobiiXR for Unity package) and cut most of the code from the GazeVisualizer there, so take a look there if you want to!

    This is the abbreviated bit of code:

    using UnityEngine;
    using Tobii.XR;

    public class EyeTrackOnScreen : MonoBehaviour
    {
        float _defaultDistance = 5f;

        void Update()
        {
            // Get the latest local-space gaze data and transform it into world space.
            IEyeTrackingProvider provider = TobiiXR.Internal.Provider;
            TobiiXR_EyeTrackingData eyeTrackingData = EyeTrackingDataHelper.Clone(provider.EyeTrackingDataLocal);
            Matrix4x4 localToWorldMatrix = provider.LocalToWorldMatrix;
            EyeTrackingDataHelper.TransformGazeData(eyeTrackingData, localToWorldMatrix);
            TobiiXR_GazeRay gazeRay = eyeTrackingData.GazeRay;

            // Project the gaze ray to a point a fixed distance along the gaze direction.
            Vector3 screenPosition = gazeRay.Origin + gazeRay.Direction.normalized * _defaultDistance;

            // Shift from a center-origin coordinate system to screen coordinates.
            // Note the 0.5f suffix: Screen.width * 0.5 is a double and won't assign to a float.
            float screenX = transform.localPosition.x + (Screen.width * 0.5f);
            float screenY = transform.localPosition.y + (Screen.height * 0.5f);
        }
    }

    Note: the coordinates are calculated for the Game screen, so you have to take into account the size of the Game view at run time. If it’s running full screen, the gaze coordinates will conform to the resolution of your monitor; if it’s running in a window, the inner dimensions of that window determine the maximum X and Y.
    The screenPosition vector has its origin at the center of the screen.
    The screenX and screenY are adjusted for the current Screen size.
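    If you prefer to let Unity do the projection, similar pixel coordinates can be obtained by projecting the world-space gaze point through the rendering camera. A minimal sketch, not from the SDK sample, assuming Camera.main is the camera that renders the Game view:

    // Project the world-space gaze point into pixel coordinates.
    // The origin is at the bottom-left of the Game view; z is the distance from the camera.
    Vector3 pixelPoint = Camera.main.WorldToScreenPoint(screenPosition);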

    Have fun!


    Grant [Tobii]

    Hi Marco, that’s great! Thanks so much for sharing your experience, much appreciated. We hope other users will find it useful as well 🙂

    Supriya Raul

    As a workaround for this problem, I placed a mesh closely in front of the camera aligned with the user’s head movements. Then, I cast a ray using the gaze origin and gaze direction to find a point of interaction with the mesh in the world space. These world space coordinates are then transformed into normalized viewport space to get the desired gaze coordinates. Hope this helps anyone who has a similar requirement!
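    Sketched in code, the approach above might look roughly like this. This is a non-authoritative sketch, not Supriya’s actual implementation: the camera reference and maximum ray distance are assumptions, the mesh in front of the camera is assumed to carry a collider so the ray can hit it, and it uses the public TobiiXR API rather than the internal provider.

    using UnityEngine;
    using Tobii.XR;

    public class GazeToViewport : MonoBehaviour
    {
        // Assumed reference: the VR camera, with a collider-bearing mesh parented in front of it.
        public Camera vrCamera;

        void Update()
        {
            // World-space gaze ray from the public TobiiXR API.
            var gazeRay = TobiiXR.GetEyeTrackingData(TobiiXR_TrackingSpace.World).GazeRay;
            if (!gazeRay.IsValid) return;

            // Cast from the gaze origin along the gaze direction onto the mesh.
            if (Physics.Raycast(gazeRay.Origin, gazeRay.Direction, out RaycastHit hit, 10f))
            {
                // Transform the world-space hit point into normalized viewport space (0..1).
                Vector3 viewport = vrCamera.WorldToViewportPoint(hit.point);
                Debug.Log($"Gaze viewport coords: {viewport.x:F2}, {viewport.y:F2}");
            }
        }
    }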

    Supriya Raul

    @Marco Otte has also suggested another interesting solution for this!

    Grant [Tobii]

    Thanks for sharing Supriya, your input is most appreciated! 🙂

    Oliver Petschick


    I’m trying to get the x and y coordinates as well – @Supriya Raul, could you explain your solution in more detail?


    Grant [Tobii]

    Pinging @supriya for you 🙂
