How to get eye coordinates from HTC Vive Pro Eye
Tagged: Eye coordinates, eye tracking, HTC Vive Pro Eye
- This topic has 23 replies, 5 voices, and was last updated 2 years, 6 months ago by Grant [Tobii].
- 07/10/2019 at 17:49 #12061 stella (Participant)
Thanks all for your responses. I was having similar questions and your answers were helpful to me.
- 07/10/2019 at 20:00 #12063
Hi @dragontattoo, great! Glad to hear it 🙂 Please don’t hesitate to get in touch again should you require any further assistance. Best wishes.
- 11/12/2019 at 14:14 #12464 Marco Otte (Participant)
Here’s what I did, and as far as I can see it works.
I used the sample scene from the Tobii SDK (TobiiXR for Unity package) and cut most of the code from the GazeVisualizer there, so take a look there if you want to!
This is the abbreviated bit of code:
using UnityEngine;
using Tobii.XR;

public class EyeTrackOnScreen : MonoBehaviour
{
    float _defaultDistance = 5f;

    void Update()
    {
        IEyeTrackingProvider provider = TobiiXR.Internal.Provider;
        TobiiXR_EyeTrackingData eyeTrackingData = EyeTrackingDataHelper.Clone(provider.EyeTrackingDataLocal);
        Matrix4x4 localToWorldMatrix = provider.LocalToWorldMatrix;
        Vector3 worldForward = localToWorldMatrix.MultiplyVector(Vector3.forward);
        TobiiXR_GazeRay gazeRay = eyeTrackingData.GazeRay;
        // Gaze point projected a fixed distance in front of the user
        Vector3 screenPosition = gazeRay.Origin + gazeRay.Direction.normalized * _defaultDistance;
        float screenX = transform.localPosition.x + (Screen.width * 0.5f); // 0.5f, not 0.5, to stay a float
        float screenY = transform.localPosition.y + (Screen.height * 0.5f);
    }
}
Note: the coordinates are calculated for the Game view, so you have to take into account what size the Game view is at run time. If it’s running full screen, the gaze coordinates will conform to the resolution of your monitor; if it’s running in a window, the inner dimensions of that window determine the maximum X and Y.
The screenPosition vector has its origin at the center of the screen.
screenX and screenY are adjusted for the current screen size.
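As an illustration of the note above (not part of the original post), Unity’s own projection can handle the center-to-pixel conversion for the current Game-view size. This is a minimal sketch, assuming `Camera.main` is the tracked HMD camera and a TobiiXR version that exposes `TobiiXR.GetEyeTrackingData`; the component name is illustrative:

```csharp
using UnityEngine;
using Tobii.XR;

// Hypothetical helper: logs the gaze point in Game-view pixel coordinates.
public class GazeToGameView : MonoBehaviour
{
    [SerializeField] float _gazeDistance = 5f;

    void Update()
    {
        // World-space gaze ray from TobiiXR
        TobiiXR_GazeRay gaze = TobiiXR.GetEyeTrackingData(TobiiXR_TrackingSpace.World).GazeRay;
        if (!gaze.IsValid) return;

        Vector3 worldPoint = gaze.Origin + gaze.Direction.normalized * _gazeDistance;
        // WorldToScreenPoint accounts for the current Game-view size:
        // (0,0) is the bottom-left pixel, (Screen.width, Screen.height) the top-right
        Vector3 pixel = Camera.main.WorldToScreenPoint(worldPoint);
        Debug.Log($"Gaze pixel: ({pixel.x:F0}, {pixel.y:F0}) of {Screen.width}x{Screen.height}");
    }
}
```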
Marco
- 13/12/2019 at 12:55 #12473
Hi Marco, that’s great! Thanks so much for sharing your experience, much appreciated. We hope other users will find it useful as well 🙂
- 17/07/2020 at 17:06 #18450 Supriya Raul (Participant)
As a workaround for this problem, I placed a mesh close in front of the camera, aligned with the user’s head movements. Then I cast a ray using the gaze origin and gaze direction to find the point of interaction with the mesh in world space. These world-space coordinates are then transformed into normalized viewport space to get the desired gaze coordinates. Hope this helps anyone with a similar requirement!
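A minimal sketch of this workaround (not Supriya’s actual code), assuming the mesh in front of the camera has a collider and a TobiiXR version that exposes `TobiiXR.GetEyeTrackingData`; component and field names are illustrative:

```csharp
using UnityEngine;
using Tobii.XR;

public class GazeViewportCoords : MonoBehaviour
{
    [SerializeField] Camera _hmdCamera; // camera following the user's head

    void Update()
    {
        TobiiXR_GazeRay gaze = TobiiXR.GetEyeTrackingData(TobiiXR_TrackingSpace.World).GazeRay;
        if (!gaze.IsValid) return;

        // Cast the gaze ray against the mesh placed in front of the camera
        if (Physics.Raycast(gaze.Origin, gaze.Direction, out RaycastHit hit))
        {
            // Normalized viewport space: (0,0) bottom-left, (1,1) top-right
            Vector3 viewport = _hmdCamera.WorldToViewportPoint(hit.point);
            Debug.Log($"Gaze viewport: ({viewport.x:F2}, {viewport.y:F2})");
        }
    }
}
```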
- 17/07/2020 at 17:14 #18451 Supriya Raul (Participant)
@Marco Otte has also suggested another interesting solution for this!
- 20/07/2020 at 10:58 #18465
Thanks for sharing Supriya, your input is most appreciated! 🙂
- 10/11/2020 at 14:41 #19271 Oliver Petschick (Participant)
I’m trying to get the x and y coordinates as well. Supriya Raul, can you maybe explain your solution?
Oliver
- 11/11/2020 at 10:52 #19277
Pinging @supriya for you 🙂