A wishlist of features

    #2998
    Christopher Lee
    Participant

    I haven’t purchased an EyeX product yet, but I’ve become very interested in learning more ever since I read about the SteelSeries and EyeX products. Until I get my hands on one, I’ve tried to come up with a list of ideas that I’d love to see such an eye tracking device do to improve workflow.

    Some of these ideas rely on a combination of products, but bear with me:
    Mouse teleportation in combination with VAC. (I see there is already work on that.)

    An option for clicking with eyes. We have two eyes. There are two major mouse buttons…

    Left and Right! Calibrate what a single, natural eye close looks like (rather than an exaggerated one-eye-open / one-eye-closed expression) versus a normal both-eyes blink. Or better yet, a single eye twitch action, reducing the strain of constantly closing your eyes to use the PC. (Perhaps a double click could be a full single-eye close?)
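
    Roughly, I picture that blink-to-click logic as a small classifier over per-eye “openness” samples. Just a sketch, with a made-up sample format (left/right openness from 0 to 1) rather than anything from the actual EyeX SDK:

    # Hypothetical blink-to-click classifier. The sample format is assumed:
    # feed(t_seconds, left_openness, right_openness), openness in 0.0-1.0.

    CLOSED_THRESHOLD = 0.2   # openness below this counts as "closed"
    MAX_WINK_S = 0.40        # quick single-eye close -> single click
    MAX_HOLD_S = 0.90        # longer single-eye close -> double click

    class BlinkClicker:
        def __init__(self):
            self.closed_since = {"left": None, "right": None}
            self.both_down = False   # set while a natural two-eye blink is in progress

        def feed(self, t, left_open, right_open):
            """Return 'left_click', 'right_click', 'double_click', or None."""
            result = None
            for eye, openness in (("left", left_open), ("right", right_open)):
                other = "right" if eye == "left" else "left"
                if openness < CLOSED_THRESHOLD:
                    if self.closed_since[eye] is None:
                        self.closed_since[eye] = t
                    if self.closed_since[other] is not None:
                        self.both_down = True        # both eyes down = natural blink, ignore
                elif self.closed_since[eye] is not None:
                    duration = t - self.closed_since[eye]
                    self.closed_since[eye] = None
                    if not self.both_down:
                        if duration <= MAX_WINK_S:
                            result = eye + "_click"  # quick wink
                        elif duration <= MAX_HOLD_S:
                            result = "double_click"  # full single-eye close
            # Reset the natural-blink flag once both eyes are open again.
            if self.closed_since["left"] is None and self.closed_since["right"] is None:
                self.both_down = False
            return result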

    An EyeX cursor zoom option, perhaps with a gesture, or… maybe a product like Leap Motion could allow for easy zooming of wherever EyeX is tracking. Simply reach slightly into space to zoom in, then blink the left eye to click.

    Or better yet, if squinting could be detected, two-eye squinting might be a great trigger for a standard zoom (i.e. you sit at a set distance, squint, then lean in slightly along with some other action, such as a key press). If EyeX couldn’t detect a squint on its own, I wonder if a webcam could assist with gesture detection…
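
    The zoom itself seems simple enough, by the way: take the gaze point, pick a zoom factor, and magnify a rectangle around it, clamped so it stays on screen. Rough sketch with made-up inputs (screen size, zoom factor):

    def zoom_rect(gaze_x, gaze_y, screen_w, screen_h, zoom):
        """Return the (left, top, width, height) region of the screen to magnify,
        centred on the gaze point and clamped to the screen edges."""
        w, h = screen_w / zoom, screen_h / zoom
        left = min(max(gaze_x - w / 2, 0), screen_w - w)
        top = min(max(gaze_y - h / 2, 0), screen_h - h)
        return left, top, w, h

    # e.g. gazing near the top-left corner of a 1920x1080 screen at 2x zoom:
    print(zoom_rect(100, 80, 1920, 1080, 2.0))   # -> (0, 0, 960.0, 540.0)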

    Dragging windows with eyesight alone would really be useful too, because it’s a pain to grab the top of each window to move it. There needs to be a shortcut that grabs the focused window and drags it wherever you look until you release it. Again, the Leap Motion could be handy (or maybe just holding down Ctrl plus an EyeX gesture, or a VAC command aimed at the currently EyeX-focused window).
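
    Something like: while a modifier key is held, keep snapping the focused window to wherever you’re looking, and stop when the key is released. A rough Windows-only sketch; the gaze source is stubbed out with the mouse cursor, since the real coordinates would come from the eye tracker:

    import ctypes
    import ctypes.wintypes
    import time

    user32 = ctypes.windll.user32
    VK_CONTROL = 0x11

    def current_gaze():
        # Stand-in for the eye tracker: just follow the mouse cursor for now.
        pt = ctypes.wintypes.POINT()
        user32.GetCursorPos(ctypes.byref(pt))
        return pt.x, pt.y

    def drag_focused_window_with_gaze():
        while True:
            # Only drag while Ctrl is held down ("till you release it").
            if user32.GetAsyncKeyState(VK_CONTROL) & 0x8000:
                hwnd = user32.GetForegroundWindow()
                rect = ctypes.wintypes.RECT()
                user32.GetWindowRect(hwnd, ctypes.byref(rect))
                w = rect.right - rect.left
                h = rect.bottom - rect.top
                gx, gy = current_gaze()
                # Keep the window roughly centred under the gaze point.
                user32.MoveWindow(hwnd, int(gx - w // 2), int(gy - h // 2), w, h, True)
            time.sleep(0.03)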

    Related to VAC, it would be great if you could use EyeX to look at an area and then read aloud an option or link that you see in the EyeX focus. Say “Calibration language interface”, and bam, it clicks that link in the visual EyeX focus.

    As mentioned already, multi-monitor support would be highly useful. But I don’t like the idea of having a bunch of eye tracking hardware on multiple monitors constantly shooting my eyes over a long period of time. Unless the infrared is low enough to cause zero strain…

    Perhaps in combination with a webcam, you could correlate head position with which monitor is active and disable the non-active EyeX trackers. (Imagine a six-monitor setup constantly blasting at you!) Plus, I’m not sure how EyeX works. Does it track constantly? Or can you have it activate only when wanted?

    Ideally, I’d like to see a single tracker calibrate to other monitors. I’m not sure if a webcam monitoring the head would help decide which monitor is active. Even if EyeX calibration accuracy drops sharply on a second monitor, being able to hit a large target, say, dragging a window from one monitor to the next, or just jumping from one active window to another on the other screen, would be a simple but hugely useful benefit. That said, focus shouldn’t shift to another window just because you’re staring at it. If you are looking at one monitor while typing on the other, you definitely don’t want to lose focus!
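
    The “don’t steal focus while I’m typing” part could just be a dwell rule: only treat a monitor as active after the gaze has rested there for a moment and the keyboard has been idle. A sketch with made-up monitor rectangles and timings:

    # Made-up monitor layout: (left, top, width, height) in virtual-desktop pixels.
    MONITORS = {
        "left":  (0,    0, 1920, 1080),
        "right": (1920, 0, 1920, 1080),
    }

    DWELL_S = 0.6         # gaze must rest this long on a monitor before it "wins"
    TYPING_GRACE_S = 2.0  # ...and no key press within the last couple of seconds

    class MonitorFocus:
        def __init__(self):
            self.active = None
            self.candidate = None
            self.candidate_since = 0.0
            self.last_keypress = float("-inf")

        def on_keypress(self, t):
            self.last_keypress = t

        def on_gaze(self, t, x, y):
            hit = None
            for name, (mx, my, mw, mh) in MONITORS.items():
                if mx <= x < mx + mw and my <= y < my + mh:
                    hit = name
                    break
            if hit != self.candidate:
                self.candidate, self.candidate_since = hit, t
            # Switch only after dwelling, and never while the keyboard is busy.
            if (hit is not None and hit != self.active
                    and t - self.candidate_since >= DWELL_S
                    and t - self.last_keypress >= TYPING_GRACE_S):
                self.active = hit
            return self.active

    A “throw the focused window to the monitor I’m looking at” shortcut would then just combine this with the window-moving trick above.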

    And how about auto-dimming? If my focus goes to another monitor or window, a slow dim of the inactive monitor could be a handy option. And if something changes visually on the inactive monitor, it could brighten, or pulsate gently until viewed.
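
    The dim itself could just be a slow ramp on a black overlay’s opacity, with a gentle pulse whenever something changes on the un-watched monitor. Purely illustrative numbers:

    import math

    DIM_AFTER_S = 3.0    # start dimming this long after the gaze leaves the monitor
    DIM_RATE = 0.15      # opacity gained per second while dimming
    MAX_DIM = 0.6        # never go fully black

    def overlay_opacity(seconds_since_looked, unseen_change=False):
        """Opacity of a black overlay laid over an inactive monitor (0 = off)."""
        if seconds_since_looked < DIM_AFTER_S:
            return 0.0
        dim = min((seconds_since_looked - DIM_AFTER_S) * DIM_RATE, MAX_DIM)
        if unseen_change:
            # Pulse gently until the user actually looks at the change.
            dim = max(0.0, dim + 0.1 * math.sin(seconds_since_looked * 2.0))
        return min(dim, MAX_DIM)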

    And somewhat related, how about an auto-focus reveal feature? If a window is covered up and you look at the part of it peeking out, the windows on top of the covered app become transparent, so you can see it until you look away from that app or use a mouse focus action.
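
    On Windows the “see through the windows on top” half is at least already plumbed (layered windows); the missing piece is knowing that the gaze is resting on the covered app. A rough sketch, with that gaze decision assumed to come from elsewhere:

    import ctypes

    user32 = ctypes.windll.user32
    GWL_EXSTYLE = -20
    WS_EX_LAYERED = 0x80000
    LWA_ALPHA = 0x2

    def set_window_alpha(hwnd, alpha):
        """Make a top-level window translucent (alpha 0-255) or opaque again."""
        style = user32.GetWindowLongW(hwnd, GWL_EXSTYLE)
        user32.SetWindowLongW(hwnd, GWL_EXSTYLE, style | WS_EX_LAYERED)
        user32.SetLayeredWindowAttributes(hwnd, 0, alpha, LWA_ALPHA)

    def reveal_covered_window(covering_hwnds, gazing_at_covered_app):
        """Fade the windows on top while the gaze rests on the peeking-out part."""
        alpha = 90 if gazing_at_covered_app else 255
        for hwnd in covering_hwnds:
            set_window_alpha(hwnd, alpha)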

    Finally, eye movement gestures as commands might be a neat possibility: a gesture you don’t usually make, maybe a quick trace of the monitor’s edges in a certain direction, or a visual X or O. Or how about looking slightly below the display and tracing an imaginary gesture off screen?
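
    An edge trace feels like the easiest of those to detect reliably: the gaze has to hug one edge of the screen and sweep along it quickly. A sketch over a short list of recent (t, x, y) gaze samples, thresholds invented:

    EDGE_BAND_PX = 60      # how close to the edge the gaze must stay
    MIN_SWEEP_FRAC = 0.7   # must cover at least this fraction of the edge's length
    MAX_DURATION_S = 1.0   # and do it quickly, so normal reading never triggers it

    def traced_top_edge(samples, screen_w):
        """samples: recent (t, x, y) gaze points, oldest first.
        Returns True if they form a quick left-to-right sweep along the top edge."""
        if len(samples) < 2:
            return False
        t0, _, _ = samples[0]
        t1, _, _ = samples[-1]
        if t1 - t0 > MAX_DURATION_S:
            return False
        if any(y > EDGE_BAND_PX for _, _, y in samples):
            return False                  # strayed away from the top edge
        xs = [x for _, x, _ in samples]
        if any(b < a for a, b in zip(xs, xs[1:])):
            return False                  # not a consistent left-to-right sweep
        return (xs[-1] - xs[0]) >= MIN_SWEEP_FRAC * screen_w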

    And of course it’d be great to be able to use EyeX to assist in Photoshop. For example, maybe you need to trace the outlines of an image. By zooming in greatly, you should be able to stay focused on the outline of the image while you use another tool, like the mouse or Leap Motion, to scroll the image.

    Bottom line:
    I’m hugely excited about what eye tracking technology might do in the future, now that it’s nearly affordable as a standard desktop accessory. This might be the year that virtual reality starts to gain a foothold, but outside of gaming I see a lot of potential.

    #3083
    Jenny [Tobii]
    Participant

    Hi Christopher,

    Thanks for your input. Some of your concepts we have actually already evaluated and have working prototypes for. They might end up in a product some time in the future.

    #3220
    Austin
    Participant

    A data stream that returns the X and Y coordinates of EACH EYE.
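
    Something along these lines per sample; just to illustrate the shape of the data I mean, not the actual SDK types:

    from dataclasses import dataclass

    @dataclass
    class PerEyeGazeSample:
        timestamp_us: int
        left_x: float    # left eye's gaze point on screen, in pixels
        left_y: float
        right_x: float   # right eye's gaze point, reported separately (not averaged)
        right_y: float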
