Mark Gatrell (Participant) | 28/05/2015 at 09:57 | #3008
I am using the EyeX Engine (via EyeXHost) to determine the current gaze point, etc. Using C# (GazePointDataStream), everything is working OK.
I would like to control a virtual world with the controller, but I need to feed the gaze coordinates to another machine that runs the virtual world. So I need to mount the sensor on the virtual-world screen and connect it to a machine running the EyeXHost software via C#.
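For the forwarding part, the gaze stream on the tracker machine can simply be relayed over the network. Below is a minimal sketch of the sending side, assuming the EyeXFramework helper library that ships with the EyeX SDK for .NET; the destination address, port, and wire format are made-up placeholders, not anything the SDK prescribes:

```csharp
using System;
using System.Net.Sockets;
using System.Text;
using EyeXFramework;
using Tobii.EyeX.Framework;

class GazeForwarder
{
    static void Main()
    {
        // Destination: the machine driving the virtual world (placeholder address/port).
        using (var udp = new UdpClient("192.168.0.10", 5555))
        using (var host = new EyeXHost())
        {
            host.Start();

            // Lightly filtered gaze points, as in the EyeX SDK samples.
            var stream = host.CreateGazePointDataStream(GazePointDataMode.LightlyFiltered);
            stream.Next += (s, e) =>
            {
                // Send "x;y" in this machine's screen pixels; the receiver
                // rescales them to the virtual-world display resolution
                // (e.g. xVirtual = x / localWidth * virtualWidth).
                byte[] payload = Encoding.ASCII.GetBytes(
                    string.Format("{0:F1};{1:F1}", e.X, e.Y));
                udp.Send(payload, payload.Length);
            };

            Console.ReadLine(); // keep streaming until Enter is pressed
        }
    }
}
```

On the virtual-world machine, a small listener would parse each datagram and apply the rescaling before driving the world. UDP is used here because the occasional lost gaze sample is usually harmless for this kind of streaming data.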
My question is this: can I calibrate the sensor by telling the user to look at a centre point on the virtual-world machine's screen, and log the calibration point in my application?
I cannot show the EyeXHost calibration application on the virtual-world display!
Thanks for your help.

Jenny [Tobii] (Participant) | 01/06/2015 at 17:06 | #3043
Since you need a custom screen setup and a custom calibration, the only option is the low-level Gaze SDK. It will require some more work to map normalized coordinates to virtual screen coordinates, and the eye-gaze data from the eye tracker is raw.

It is kind of possible to use both the Gaze SDK and the EyeX SDK from the same application, but there are some tricky things you need to know about: the EyeX Engine will overwrite any calibration loaded on the eye tracker whenever a system event invalidates the current configuration, for example when the resolution of a screen changes, when a monitor is added to or removed from the system, or when the user logs off and on again.

If you can control for these events while running your application, you could use the Gaze SDK for setting up the screen (which only has to be done once) and for the calibration (every time you start the application), and then use the EyeX SDK for the rest.

Mark Gatrell (Participant) | 03/06/2015 at 07:06 | #3054
Thank you for your help on this.
I will experiment with your suggestions.