05/01/2014 at 01:37 #177
patrickb323 (Participant)
I just downloaded and installed the EyeX Engine 0.8.3.249 and am using it with a REX device. Everything seems to calibrate properly and is running fine, but it doesn't do anything. What should I expect when looking at my desktop? I would expect the mouse pointer to move, or a window to be selected when I look at it and left-click. I am on Windows 7; will this only work on Windows 8?

05/01/2014 at 09:23 #178
Robert [Tobii] (Participant)
Great to hear that you managed to start the engine. Upcoming versions of the EyeX software will have many cool functions to use your eyes for interaction in the regular Windows environment (both Win7 and Win8), but the EyeX Engine developer preview only contains the backend component. So the only way to see any interaction examples is to compile and run the code samples included in the SDK packages.
We will show a beta version of the complete EyeX software at the CES exhibition next week. There will also be a beta program during spring, where you will be able to try out the release before everyone else. If you are interested in trying the beta, we will post information about it on the Developer Blog.

30/01/2014 at 17:36 #314
Veronika (Participant)
I have installed the EyeX software on my computer and tried to launch the provided samples. The samples compiled and launched fine; however, not much happens inside the applications. For example, in the ActivatableButtons sample, two buttons are supposed to change the background color, but nothing happens while I am looking at either of them. When I debugged the code, I saw that no events arrive in the WndProc function while I am looking anywhere on the screen, or at the buttons. The window title does show "use your eyes", which I suppose means that the EyeX Engine was accessed successfully.
I have a similar issue with the X-O game sample. I can see that the cell I am looking at becomes slightly lighter, but to place a mark I still need to click the board with my mouse, and the mark is created in the cell where the mouse click was performed. So I am not sure whether the program is functioning correctly after all.
I am wondering whether this behavior is caused by misconfiguration, or whether I have missed something during installation or launching. The Tobii device itself works fine and can detect my gaze in the Tobii configuration window.
My question is also about the reliability of the EyeX software. As I understand it, this functionality is newly released. Would I be better off using the Tobii Gaze SDK, or should I keep trying with EyeX? My intention is to develop a Windows application whose behavior depends on the user's gaze, and development should start as soon as possible.
Veronika

30/01/2014 at 20:35 #315
gigertron (Participant)
If you want to control your mouse, you can just add a SetCursorPos(x, y) call to one of the samples in the SDK. I've been doing this all week, and it's pretty usable with a little extra coding wrapped around it. I've got video examples up in the community projects forum.

30/01/2014 at 22:37 #316
Jenny [Tobii] (Participant)
The ActivatableButtons and X-O game samples make use of the so-called Activatable Behavior (or EyeX Click). Nothing happens until the trigger key for the Activatable Behavior is pressed, which is the Applications key:
[Image: Activating an activatable interactor]
So the idea with this behavior is that you look at the button (or region) you want to activate, and then press the Applications key to actually click it.
This is an example of a multimodal interaction that removes the need for a mouse cursor altogether: point with your eyes, and click with a key press.
An advantage of using the Activatable Behavior is that it has built-in snapping, so your eye-gaze only needs to be close enough to a button to activate it, and it does some other clever processing behind the scenes.
The Activatable Behavior also includes events that you can use to highlight a region when it has gaze focus. You can read more about the behavior in the Developer's Guide.
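The snapping idea can be sketched as a hit-test that picks the interactor whose bounds are nearest the gaze point, as long as it lies within some snap radius. This is only an illustration of the concept, not the EyeX Engine's actual algorithm, and the names (`Rect`, `nearest_interactor`, `snap_radius`) are hypothetical:

```python
# Illustrative snapping hit-test: choose the interactor closest to the
# gaze point, provided it is within a snap radius. Hypothetical names;
# NOT the Tobii EyeX SDK API or its real algorithm.
from dataclasses import dataclass


@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def distance_to(self, x, y):
        """Euclidean distance from a point to this rectangle (0 if inside)."""
        dx = max(self.left - x, 0, x - self.right)
        dy = max(self.top - y, 0, y - self.bottom)
        return (dx * dx + dy * dy) ** 0.5


def nearest_interactor(gaze, interactors, snap_radius=50.0):
    """Return the id of the interactor nearest the gaze point, or None."""
    x, y = gaze
    best_id, best_dist = None, snap_radius
    for iid, rect in interactors.items():
        d = rect.distance_to(x, y)
        if d <= best_dist:
            best_id, best_dist = iid, d
    return best_id
```

With this kind of logic, a gaze point that misses a button by a few pixels still selects it, while a point far from every interactor selects nothing.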
Regarding the reliability of the EyeX software, I would say that the functionality we have written samples for is reliable to build on. We plan to expand the set of samples as the API matures and features are added.
For your application, maybe you could use the Gaze-Aware behavior, which generates events when the user's eye-gaze enters and leaves a specific region of the screen? This behavior is also described in the Developer's Guide. Depending on what you want to achieve, one of the data streams may be suitable as well.
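The enter/leave pattern that the Gaze-Aware behavior provides can be sketched as a small state machine: track whether the gaze is currently inside a region and emit an event only on transitions. `GazeAwareRegion` and `feed` are illustrative names for this sketch, not classes or methods from the EyeX SDK:

```python
# Hypothetical sketch of gaze enter/leave events for a screen region.
# An event fires only when the inside/outside state changes, which is
# the pattern the Gaze-Aware behavior exposes to applications.
class GazeAwareRegion:
    def __init__(self, left, top, right, bottom):
        self.bounds = (left, top, right, bottom)
        self.has_gaze = False

    def _contains(self, x, y):
        left, top, right, bottom = self.bounds
        return left <= x <= right and top <= y <= bottom

    def feed(self, x, y):
        """Feed one gaze sample; return 'enter', 'leave', or None."""
        inside = self._contains(x, y)
        if inside and not self.has_gaze:
            self.has_gaze = True
            return "enter"
        if not inside and self.has_gaze:
            self.has_gaze = False
            return "leave"
        return None
```

An application would react to the "enter" event by, say, highlighting the region, and to "leave" by clearing the highlight.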
I hope that answers your questions and that the samples work as intended using the Applications key to trigger the EyeX Click.

31/01/2014 at 15:54 #325
Veronika (Participant)
Thank you for the detailed answer. The samples work fine with the Applications key.
My question is now related to trigger keys for activating behaviors.
As I see in the documentation: "by default, the EyeX Click is assigned to the Windows Applications/Context key … Future releases of the engine may provide additional ways of triggering activation, such as touchpad gestures, or voice commands."
Do I understand correctly that, at the moment, no key other than the Applications key can be a trigger, and that no other hardware or software (e.g. a touchpad) can be a trigger either?
Thank you, Veronika

31/01/2014 at 17:47 #327
Jenny [Tobii] (Participant)
Great that you got the samples working!
Regarding trigger keys:
At the moment the Applications key is the only trigger key for the Activatable Behavior; different behaviors have different trigger keys. No other hardware input, such as a touchpad, is supported right now. There is a way to trigger an activation programmatically, but then the client application has to handle the user input itself.
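The programmatic route can be sketched as follows: the client listens for whatever input it likes (a key, a touchpad tap, a voice command) and, when that input fires, requests activation of whichever interactor currently has gaze focus. The names below (`ActivationDispatcher`, `on_gaze_focus`, `on_key`) are hypothetical, and the actual SDK call for requesting activation is the one described in the Developer's Guide:

```python
# Sketch of client-side trigger handling: any input event the client
# chooses can play the role of the Applications key. Illustrative names
# only; not the EyeX SDK API.
class ActivationDispatcher:
    def __init__(self, trigger_key="APPS"):
        self.trigger_key = trigger_key   # configurable by the client
        self.focused = None              # interactor id with gaze focus

    def on_gaze_focus(self, interactor_id):
        """Track which interactor the user's gaze currently rests on."""
        self.focused = interactor_id

    def on_key(self, key):
        """Return the interactor to activate for this key press, if any."""
        if key == self.trigger_key and self.focused is not None:
            # Here the real client would request activation
            # programmatically through the engine.
            return self.focused
        return None
```

The trade-off mentioned above applies: the client now owns all the input handling that the engine would otherwise do for the default Applications key.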
Functionality to change the default keys is under development.