Have any of you tried to interface with the EyeX using an Arduino? Is it doable?
We would like the Arduino to process the gaze data from the eye tracker and use it as input to the controller of a wheelchair.
We already have an Arduino set up with a MATLAB interface, but now we would like to skip the MATLAB phase altogether and use the gaze data from the eye tracker to control the wheelchair directly via the Arduino.
Interesting question. I would like to use Arduino with eye tracking myself 🙂
Unfortunately, I do not think it is possible, for a number of reasons:
– The EyeX Engine can only run on a Windows computer, where it uses the multi-threading capabilities of the system. The AVR microcontroller on an Arduino does not even have an operating system, and its hardware does not support multi-threading at all (only hardware interrupts).
– If you used the Gaze SDK instead of the EyeX SDK, you would still not be able to run it on an Arduino, because the Gaze SDK requires at least two threads running simultaneously: the main thread and the eye tracker event loop.
– The most fundamental problem of all is that the EyeX Controller requires a device driver to function. There are no device drivers for Arduino, and my best guess is that Tobii will not develop any, given the restrictions above.
What if you used the UE4Duino plugin together with the EyeX SDK plugin in Unreal Engine? Through Blueprints, couldn’t any interaction between those two devices be possible? Maybe not accessing the EyeX hardware directly, but you would have access to the gaze point data, depending on what you want to actuate or do. I was able to get both plugins to work with Unreal.
Yes, running the gaze-enabled application on Windows and then communicating with the Arduino from Windows should work.
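To make the idea concrete, here is a minimal sketch of the Windows-side bridge. The `"G,x,y\n"` line protocol, the `gaze_to_packet` name, the COM port, and the baud rate are all my own assumptions for illustration, not anything defined by the EyeX SDK; the gaze-enabled Windows app would encode each gaze point as a text line, and the Arduino sketch would parse the same format on its end with e.g. `Serial.readStringUntil('\n')`.

```python
# Hypothetical bridge: the Windows app receives gaze points from the
# EyeX/Gaze SDK callback and forwards them to the Arduino over serial.
# The "G,x,y\n" line framing is an assumption chosen for easy parsing.

def gaze_to_packet(x: float, y: float) -> bytes:
    """Encode a gaze point (screen pixels) as one ASCII line."""
    return f"G,{x:.1f},{y:.1f}\n".encode("ascii")

def packet_to_gaze(packet: bytes) -> tuple:
    """Mirror of the parsing the Arduino sketch would do on its side."""
    tag, x, y = packet.decode("ascii").strip().split(",")
    assert tag == "G"
    return (float(x), float(y))

if __name__ == "__main__":
    # Real use would open the port with pyserial, e.g.:
    #   import serial
    #   port = serial.Serial("COM3", 115200)  # port name is an assumption
    #   port.write(gaze_to_packet(gx, gy))
    pkt = gaze_to_packet(960.0, 540.0)
    print(pkt)                  # b'G,960.0,540.0\n'
    print(packet_to_gaze(pkt))  # (960.0, 540.0)
```

The point of a simple text protocol like this is that the Arduino side stays trivial: read a line, split on commas, map the coordinates to motor commands for the wheelchair controller.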
The next question would be: would the user interact with the application via a graphical interface on a computer monitor? If yes, EyeX could be used; if no, the Gaze SDK would have to be used (to allow a custom screen setup).