Voice control + pointer button teleport

  • #1064
    iWANTiTRACKING
    Participant

    Pointer button teleport works quite well when triggered from the keyboard, but when I attempt to trigger it through voice commands, the voice command does not move the cursor.

    I am using VAC for voice control. The voice commands work when the eye tracker is off. When the eye tracking software is on, whichever key is associated with the eye tracker pointer movement is no longer triggered by voice, although it is still triggered by the keyboard. I believe this could be in part because of how the eye tracking software disables the chosen physical key. Is there an option to keep the key that triggers the pointer movement from being disabled?

    If you could point me to the location in the code that handles this, I would be interested in working on it as well.

    Have you tested the eye tracker with other voice control applications?

    Thank you,
    Tim

    #1069
    Anders
    Participant

    Hi Tim,
    do I understand correctly that you’re using the VAC voice control software and injecting a key press event whenever a voice command is received?

    #1076
    iWANTiTRACKING
    Participant

    Correct, the VAC voice control software produces the key press event, and I have bound the voice command to the same key as Tobii’s pointer button teleport so that the command triggers the teleport.

    Thanks!
    Tim

    #1077
    Robert [Tobii]
    Participant

    Hi Tim,

    Thank you for the clarification. In general, we recommend developing applications using the EyeX Engine API rather than extending functionality in EyeX for Windows. There are two reasons for this:
    1) EyeX for Windows might be disabled or unavailable on some systems.
    2) EyeX for Windows is under development and future versions might not be backwards-compatible.

    I would recommend building the “voice teleport” function yourself. You only need to subscribe to the gaze data stream (take a look at the samples in the SDK packages) and, when you receive the voice command, use the Win32 function SetCursorPos to teleport the cursor to the latest (x, y) coordinates from the data stream.
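    As a rough illustration (a sketch only, not a complete program), this is what that idea looks like, assuming the EyeX context and gaze data stream are set up as in the MinimalGazeDataStream sample; the function and variable names here are placeholders:

    #include <windows.h>
    #include <eyex/EyeX.h>

    /* Latest gaze point, updated by the gaze data stream. */
    static LONG g_lastGazeX = 0;
    static LONG g_lastGazeY = 0;

    /* Called from the sample's event handler whenever a new gaze point arrives. */
    void OnGazeDataEvent(TX_HANDLE hGazeDataBehavior)
    {
    	TX_GAZEPOINTDATAEVENTPARAMS eventParams;
    	if (txGetGazePointDataEventParams(hGazeDataBehavior, &eventParams) == TX_RESULT_OK) {
    		g_lastGazeX = (LONG)eventParams.X;
    		g_lastGazeY = (LONG)eventParams.Y;
    	}
    }

    /* Call this when the voice command (or the key press it injects) is received. */
    void TeleportCursorToGaze(void)
    {
    	SetCursorPos(g_lastGazeX, g_lastGazeY);
    }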

    #1181
    James Stout
    Participant

    I ran into the same problem and implemented a workaround as you suggested. However, there is an important point here that should not be missed: we should be able to trigger the direct click functionality via programmatic keypresses, not just physical keypresses. This has nothing to do with EyeX for Windows, since the direct click concept (activation key) is part of the API. Otherwise, software designed with your API will not be accessible, e.g. by Dragon NaturallySpeaking users.

    Specifically, you can reproduce this by trying to send the direct click programmatically using the standard SendInput function:
    http://msdn.microsoft.com/en-us/library/windows/desktop/ms646310(v=vs.85).aspx
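
    For concreteness, this is roughly what a simulated tap of the activation key looks like via SendInput (VK_F13 below is only a placeholder for whatever key is bound as the activation key):

    #include <windows.h>

    /* Simulate a press and release of the given virtual key. */
    void SendSimulatedKeyTap(WORD vk)
    {
    	INPUT inputs[2] = {0};
    	inputs[0].type = INPUT_KEYBOARD;
    	inputs[0].ki.wVk = vk;                  /* key down */
    	inputs[1].type = INPUT_KEYBOARD;
    	inputs[1].ki.wVk = vk;
    	inputs[1].ki.dwFlags = KEYEVENTF_KEYUP; /* key up */
    	SendInput(2, inputs, sizeof(INPUT));
    }

    /* e.g. SendSimulatedKeyTap(VK_F13); -- the direct click does not react to this. */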

    I can remap the direct click, but that doesn’t fix it. I’m a heavy Dragon NaturallySpeaking user and this is the first time I’ve ever seen this happen, so it would be a big surprise for many users.

    #1182
    Robert [Tobii]
    Participant

    Hi James,

    You are right, the Direct Click (activation) command is exposed via the API and it is important that it can be triggered programmatically. Luckily, there is a way to do this without sending key presses by using “action commands”.

    In the ActivatableButtons project in the C/C++ sample solution you can see it in action:

    
    void EyeXHost::TriggerActivation()
    {
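    	// Create an ACTIVATE action command on the interaction context, execute it
    	// asynchronously (fire-and-forget), and release the command handle.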
    	TX_HANDLE command(TX_EMPTY_HANDLE);
    	txCreateActionCommand(_context, &command, TX_ACTIONTYPE_ACTIVATE);
    	txExecuteCommandAsync(command, NULL, NULL);
    	txReleaseObject(&command);
    }
    

    There are many other action commands you can send as well, check out the TX_ACTIONTYPE enum.

    #1188
    James Stout
    Participant

    Thanks. I did find this, but this will not work for the majority of users with accessibility needs, who are not programmers. Any software that simulates keypresses would have to add specific support for your API, which is unrealistic.

    All I’m requesting is that the engine respond to the standard Windows API for simulated keypresses. I haven’t seen this fail in any other application.

    Thanks,
    James

    #1189
    Robert [Tobii]
    Participant

    Ok, thanks for clarifying. I have filed the feature request with the EyeX Engine team.

    #1398
    iWANTiTRACKING
    Participant

    Is the feature included in EyeX Engine 0.8.17?
    Thanks!

    #1414
    Jenny [Tobii]
    Participant

    Hi Tim,

    No, it is not included in EyeX Engine 0.8.17. We have simply forwarded the feature request to the engine team; there is no guarantee that they will implement it. It is not trivial for an application that itself injects key presses to also listen for injected key presses, since you can end up in an infinite loop of injecting and listening to your own key presses.

    #1734
    Tim
    Participant

    Understood. Could it be implemented like other features you already offer, such as the pointer button teleport: disabled by default, but with an option for the user to enable it? That way, any looping caused by injected key presses could be avoided by leaving the option disabled in software that runs into that kind of issue.

    Thank you!
    Tim

    #1895
    Tim
    Participant

    Has anyone gotten voice control to work with the pointer button teleport, or does anyone know which piece of the code disables the button’s function so that it cannot be used with other applications?

    “Tobii EyeX interactions combine eye tracking with input from traditional controls. Use your eyes to navigate and select. Then execute with a key, touchpad or voice command.” I want to execute with voice. If anyone is working on their own voice control integration, I would be interested in testing it, or in hearing from anyone who has found where the pointer button teleport option handles the disabling.

    Thanks!
    Tim

    #1901
    James Stout
    Participant

    I don’t think there’s an easy fix for this yet. I’m using a very complicated workaround: I created a DLL that provides some basic functionality on top of their API, and then I use Python extensions to Dragon to call into this DLL, completely bypassing the key bindings. This is really something Tobii ought to address directly. I realize it’s not trivial, but they could keep track of their own simulated keypresses to avoid an infinite loop. Honestly, I’m not even convinced that’s necessary: a loop is only a problem if the simulated keypresses their API listens for themselves trigger other simulated keypresses. Just listen for the subset we are requesting here, and there won’t be a problem.
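
    To sketch what I mean (an illustration only, nothing in the EyeX API): an application that both injects and hooks key presses can tag its own SendInput events via dwExtraInfo and skip them in its low-level hook, so it never reacts to its own injections. The marker value and function names are made up for this example:

    #include <windows.h>

    #define SELF_INJECTED_MARKER 0xE7E70001  /* arbitrary marker value */

    /* Inject a key-down event tagged as "ours". */
    void InjectTaggedKeyDown(WORD vk)
    {
    	INPUT in = {0};
    	in.type = INPUT_KEYBOARD;
    	in.ki.wVk = vk;
    	in.ki.dwExtraInfo = SELF_INJECTED_MARKER;
    	SendInput(1, &in, sizeof(INPUT));
    }

    /* Low-level keyboard hook that ignores the events we injected ourselves. */
    LRESULT CALLBACK FilteringKeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
    	if (nCode == HC_ACTION) {
    		const KBDLLHOOKSTRUCT *kb = (const KBDLLHOOKSTRUCT *)lParam;
    		if (kb->dwExtraInfo != SELF_INJECTED_MARKER) {
    			/* ... handle key presses from the user or other software ... */
    		}
    	}
    	return CallNextHookEx(NULL, nCode, wParam, lParam);
    }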

    #1910
    Robert [Tobii]
    Participant

    Hi,

    Great to hear that you are working on voice control integration. I think it is a really interesting use case.

    However, in the new EyeX 1.0.0 release, published yesterday, we have made some changes to the division of responsibilities between the EyeX Engine and EyeX Interaction (formerly known as EyeX for Windows) that you need to know about:

    • From now on, the EyeX Engine does not hook any keyboard keys at all. The EyeX Engine API just exposes the Activatable behavior with action commands to trigger activation or to toggle activation mode/activation focus on or off.
    • The EyeX Interaction application hooks one or two keys, depending on the Direct Interaction settings. In one-button mode, a quick tap on the activation key sends an activation command immediately. In two-button mode, you can hold down the activation key to get activation focus, and the activation is not performed until you release the key.

    Since the pointer teleport functionality also resides in the EyeX Interaction software, and since this software does not have an API, we do not recommend building any voice integration that depends on either direct click or pointer teleport.

    As I said before, I would recommend building a little proxy program in between that:

    • Registers as an EyeX Engine Client so it can receive gaze data and send action commands.
    • Registers low level hooks for 3 keys: move cursor, toggle activation focus and activate.
    • If a “move cursor key” is received: SetCursorPos(lastGazeX, lastGazeY).
    • If a “toggle activation key” is received, turn on the activation focus via an action command.
    • If an “activate key” is received, send an activation action command.
    • Then you can use Dragon or VAC or whatever to listen for voice commands and send key presses that your proxy program can handle (see the sketch below).

      If you take inspiration from the C code sample MinimalGazeDataStream, you can probably build such a program in a couple of hours.
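
    To make the proxy idea concrete, here is a rough sketch (not a complete program) of the key-hook side, assuming the EyeX context and a gaze data stream that updates g_lastGazeX/g_lastGazeY have been set up as in MinimalGazeDataStream. The key choices (F13–F15) are placeholders, and the activation-mode action type name is a guess, so check the TX_ACTIONTYPE enum for the exact values:

    #include <windows.h>
    #include <eyex/EyeX.h>

    extern TX_CONTEXTHANDLE g_context;          /* created with txCreateContext elsewhere */
    extern LONG g_lastGazeX, g_lastGazeY;       /* updated by the gaze data stream */

    static void SendActionCommand(TX_ACTIONTYPE actionType)
    {
    	TX_HANDLE command = TX_EMPTY_HANDLE;
    	txCreateActionCommand(g_context, &command, actionType);
    	txExecuteCommandAsync(command, NULL, NULL);
    	txReleaseObject(&command);
    }

    /* Low-level keyboard hook: fires for physical and injected (SendInput) key presses alike. */
    static LRESULT CALLBACK ProxyKeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
    {
    	if (nCode == HC_ACTION && wParam == WM_KEYDOWN) {
    		const KBDLLHOOKSTRUCT *kb = (const KBDLLHOOKSTRUCT *)lParam;
    		switch (kb->vkCode) {
    		case VK_F13: /* "move cursor" key */
    			SetCursorPos(g_lastGazeX, g_lastGazeY);
    			break;
    		case VK_F14: /* "toggle activation focus" key; exact action type name may differ */
    			SendActionCommand(TX_ACTIONTYPE_ACTIVATIONMODEON);
    			break;
    		case VK_F15: /* "activate" key */
    			SendActionCommand(TX_ACTIONTYPE_ACTIVATE);
    			break;
    		}
    	}
    	return CallNextHookEx(NULL, nCode, wParam, lParam);
    }

    /* Install the hook; the installing thread needs to run a message loop. */
    BOOL InstallProxyHook(void)
    {
    	return SetWindowsHookEx(WH_KEYBOARD_LL, ProxyKeyboardProc, GetModuleHandle(NULL), 0) != NULL;
    }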

    #1914
    James Stout
    Participant

    Just so we’re on the same page, what you described is exactly what I built. It didn’t take long, and I did base it on the C code example you mentioned. Nevertheless, I still think it is asking a lot of the average voice-recognition user to reimplement the EyeX Interaction hooks themselves. If the software could respond to simulated keypresses like every other application I use, it would not have required any custom code. I’m a developer, and this is an SDK, so I’m happy to do this myself, but this is really something that should be addressed for the end product.
