- 19/04/2016 at 05:03 #5057 Ed Chang, Participant
Hi, I’d like to define regions on the screen where gaze is translated to key presses. This could be used to scroll a web page up and down, or play online browser games which usually require key presses.
I’m able to build and run the C++ EyeX sample application “ActivatableButtons”, but that’s not exactly what I need. Should I work on modifying that app, or is there a better example to start from?

21/04/2016 at 09:48 #5069 Alex [Tobii], Participant
You may want to take a look at this third-party project.
If you want to code such functionality yourself, you should start with these samples:
– TobiiEyeXSdk-DotNet-1.7.480.zip\source\WpfSamples\ActivatableElements & SDK documentation
– TobiiEyeXSdk-Cpp-1.7.480.zip\samples\ActivatableButtons if you prefer C++
I also want to mention that we don’t recommend completely hands-free interaction in most cases, except for accessibility applications. The preferred approach is to activate elements by clicking a button on a mouse, keyboard, or gamepad.

22/04/2016 at 16:52 #5077 Ed Chang, Participant
Those links are very helpful, thanks!