Clippy: accurate mouse control

    #6050
    Nate
    Participant

    I have a free, open source project called Clippy, which started as a clipboard history and over time has grown into a multifunctional productivity tool, specifically for programmers and other power users. I have now added mouse control using a Tobii Eye Tracker 4C. It differs from the Tobii software in that it combines eye tracking with head tracking. I think this combination works much better than eye tracking alone, and that a lot of people would benefit if the Tobii software used this approach. You can check it out here:

    https://github.com/EsotericSoftware/clippy/tree/2.25#eye-tracking

    #6057
    Tristan Hume
    Participant

    Cool. I think you may be interested in the 4 months of experimentation and research I did on the subject of mouse control by combining eye tracking and head tracking. All the code I wrote is open source here: https://github.com/trishume/PolyMouse

    I used a variety of different methods of combining eye and head tracking, none of which are the same as yours, because I was intent on supporting use without a hotkey and allowing for hovers and drags easily. My best system basically acts like a head tracking mouse, but when you try to move long distances it uses eye tracking information to quickly move the cursor to where you are looking. With it I am as fast and accurate as with a nice MacBook trackpad, which is quite good and almost as fast as a mouse.
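
    In rough Java, the fused behavior might look like the sketch below. This is not the actual PolyMouse code (PolyMouse is C++); the tracker inputs, gain, and warp threshold are all assumptions for illustration.

        import java.awt.AWTException;
        import java.awt.MouseInfo;
        import java.awt.Point;
        import java.awt.Robot;

        // Sketch of a fused pointer: the cursor normally follows small head
        // movements, but warps toward the gaze point for long moves.
        public class FusedPointer {
            static final int WARP_THRESHOLD = 300; // px; warp only for long moves
            static final double GAIN = 8; // head-to-cursor sensitivity (assumed)

            final Robot robot;

            FusedPointer () throws AWTException {
                robot = new Robot();
            }

            // Called on every tracker frame. gazeX/gazeY come from the eye
            // tracker, headDx/headDy are head movement deltas (hypothetical).
            void onFrame (int gazeX, int gazeY, double headDx, double headDy) {
                Point mouse = MouseInfo.getPointerInfo().getLocation();
                if (mouse.distance(gazeX, gazeY) > WARP_THRESHOLD) {
                    robot.mouseMove(gazeX, gazeY); // long move: jump to the gaze point
                } else {
                    robot.mouseMove( // short move: precise control via the head alone
                        (int)Math.round(mouse.x + headDx * GAIN),
                        (int)Math.round(mouse.y + headDy * GAIN));
                }
            }
        }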

    I tried using a Tobii EyeX (Note to Tobii people: I only ever used it for real time interaction) with my system and unfortunately its head tracking support wasn’t accurate enough for really good control, and my IR head tracker interfered with its tracking. I eventually wrote my own visual light accurate head tracker that used a PS Eye: https://github.com/trishume/SmartHeadTracker

    Together the EyeX and the coloured dot tracker actually worked reasonably well, except near the corners of the screen where quality degrades.

    One thing I’m interested in about the 4C is how much better the head tracking is. Is it the same as the eye position support of the EyeX, or is it substantially better? Is there a new API for it, or did the eye position accuracy just improve? Nate, I’m not sure if you’ll be able to answer these questions, but if any Tobii people see this, I’d appreciate an answer.

    #6058
    Nate
    Participant

    Neat, thanks for sharing! I figured I couldn’t be the only one to think of combining head and eye tracking. I am a bit surprised the Tobii software doesn’t have a mode to do it, considering how easy it is. My implementation:
    https://github.com/EsotericSoftware/clippy/blob/master/src/com/esotericsoftware/clippy/Tobii.java

    Interesting that you use sound to click/drag. I personally don’t have a problem with a hotkey; I use capslock. Otherwise it sounds like our approaches are similar, though I’m guessing your mouse control is always on?

    Many years ago I used a TrackIR for mouse control. My coworkers were always amused when I’d go to lunch wearing a reflective bindi, and of course they never reminded me to take it off until after lunch. I really like not wearing a dot, hat, or even a headset. I used a hotkey with the TrackIR because with it always on I found myself trying to hold still to avoid the mouse bouncing around. Even with a hotkey, at my level of use (10-14 hour days) I found the TrackIR fatigued my neck and shoulders, and eventually I gave it up. Since then I’ve been using a Kensington Expert Mouse (which is actually a trackball, with a great scroll ring) for years.

    I don’t know if the 4C is more accurate with head tracking than previous versions, but it works very well for fine adjustments after using eye tracking for the large jump. Fine adjustments generally need to move less than ~120 pixels on a 2560×1440 screen from the gaze point, and I can accomplish this with high accuracy by moving my head just a few centimeters using the 4C. I don’t know if the head tracking accuracy is sufficient for moving the mouse across the entire screen, but given my TrackIR experience I’m not interested in that solution.
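
    The press/hold/release flow is roughly the following. This is a condensed sketch, not the actual Tobii.java; the tracker callbacks and the head gain are hypothetical.

        import java.awt.AWTException;
        import java.awt.Robot;
        import java.awt.event.InputEvent;

        // Sketch of the hotkey flow: pressing jumps to the gaze point, holding
        // lets small head movements fine-tune, releasing clicks.
        public class HotkeyMouse {
            final Robot robot;
            boolean hotkeyDown;
            int x, y;

            HotkeyMouse () throws AWTException {
                robot = new Robot();
            }

            void hotkeyPressed (int gazeX, int gazeY) {
                hotkeyDown = true;
                x = gazeX;
                y = gazeY;
                robot.mouseMove(x, y); // large jump via eye tracking
            }

            void headMoved (double dx, double dy) {
                if (!hotkeyDown) return; // only adjust while the hotkey is held
                x += (int)Math.round(dx * 8); // small corrections, typically <~120px
                y += (int)Math.round(dy * 8);
                robot.mouseMove(x, y);
            }

            void hotkeyReleased () {
                hotkeyDown = false;
                robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
                robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
            }
        }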

    I also don’t know if the API has changed, but it is as simple as retrieving the eye position values, so I expect any improvements would be with the device. My native code is here:
    https://github.com/EsotericSoftware/clippy/blob/master/jni/com.esotericsoftware.clippy.tobii.EyeX.c#L140-L148

    The next thing I want to try is to remember the correction distance done by head tracking after moving the mouse cursor to the gaze point, then interpolate using those correction distances to offset subsequent gaze points. In other words, I want to use the head tracking fine adjustments as calibration data to continuously improve eye tracking accuracy. Ultimately this is still limited to the 4C accuracy, but I wonder if my calibration would be superior to Tobii’s. At the very least, since my calibration is constantly improved, it may remove the need to recalibrate periodically.
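
    One way it could work is to keep recent corrections and blend them by distance, something like the hypothetical sketch below. This is just the idea, not existing Clippy code; the history size matches the snapping I use elsewhere, the weighting is an assumption.

        import java.awt.Point;
        import java.util.ArrayDeque;

        // Hypothetical continuous calibration: remember each head tracking
        // correction and use nearby ones to pre-offset future gaze points.
        public class GazeCalibration {
            static class Correction {
                final Point gaze, offset;
                Correction (Point gaze, Point offset) {
                    this.gaze = gaze;
                    this.offset = offset;
                }
            }

            final ArrayDeque<Correction> corrections = new ArrayDeque<>();

            // Record the raw gaze point and the correction made via head tracking.
            void record (Point gaze, Point clicked) {
                corrections.addFirst(new Correction(gaze, new Point(clicked.x - gaze.x, clicked.y - gaze.y)));
                if (corrections.size() > 50) corrections.removeLast();
            }

            // Blend stored corrections, weighting nearer ones more heavily.
            Point apply (Point gaze) {
                double wSum = 0, dx = 0, dy = 0;
                for (Correction c : corrections) {
                    double w = 1 / (1 + gaze.distance(c.gaze));
                    wSum += w;
                    dx += c.offset.x * w;
                    dy += c.offset.y * w;
                }
                if (wSum == 0) return gaze;
                return new Point(gaze.x + (int)Math.round(dx / wSum), gaze.y + (int)Math.round(dy / wSum));
            }
        }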

    #6059
    Tristan Hume
    Participant

    My software has something similar: the ability to use actual click locations to correct for bad calibration, though the correction isn’t perfect. In the past I have found that Tobii calibrations stay good for quite a while though; have you not found this?

    I’m now inclined to get a 4C myself and try to get my system to work with it.

    Have you actually been using this for day-to-day mousing instead of your expert mouse lately?

    #6060
    Nate
    Participant

    I’ve only had the 4C a couple of days. I’m using it along with the Expert Mouse; I use either one, depending on what I’m doing. I try to use the 4C to avoid switching to the mouse and then back. E.g., while coding it’s nice to be able to type, click somewhere, and type some more without grabbing the mouse or jamming on the arrow keys. If I’m going to be mousing for a while, I’ll use the Expert Mouse. I also use the Expert Mouse scroll ring for scrolling.

    It’s very fast to switch windows by look-clicking the Windows taskbar at the bottom of the screen. It’s also very fast to switch tabs by look-clicking. Both of those actions tend to be accurate because they are on the edge of the screen. If eye tracking chooses a point off screen, Windows prevents the mouse cursor from leaving the screen. This means I can just look, press, and release capslock and it almost always does the right thing, without needing to hold down capslock to use head tracking.

    #6063
    Nate
    Participant

    I tried using head tracking adjustments for calibration. It worked, sometimes quite well, but other times made accuracy worse. Calibration is a sensitive thing. If some bad data gets into it, it can be frustrating. Maybe it could be improved, but I took it out.

    That left me annoyed with the situation where I need to click the same button often. It sucks to need to use the same amount of head tracking every damn time. To help with this I added snapping, where for the last 50 clicks it remembers the initial eye position and the click position after head tracking adjustment. When the hotkey is pressed, if the eye position is close to one of the stored eye positions, it uses the stored click position rather than the eye position. Now I only need head tracking adjustment on the first click! This also helps when clicking something twice: look, hotkey, hotkey. There is a bit too much jitter to get a double click consistently though.
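
    The snapping amounts to something like this simplified sketch. The 50-click history is as described; the snap radius is an assumed value, and this is not the actual Clippy code.

        import java.awt.Point;
        import java.util.ArrayDeque;

        // Sketch of click snapping: remember recent clicks and reuse a stored
        // click position when the gaze lands close to the gaze point that
        // produced it.
        public class ClickSnapping {
            static final int MAX_CLICKS = 50;
            static final int SNAP_RADIUS = 60; // px; assumed value

            static class Click {
                final Point gaze, click;
                Click (Point gaze, Point click) {
                    this.gaze = gaze;
                    this.click = click;
                }
            }

            final ArrayDeque<Click> recent = new ArrayDeque<>();

            // Store the initial gaze point and the head-adjusted click position.
            void remember (Point gaze, Point click) {
                recent.addFirst(new Click(gaze, click));
                if (recent.size() > MAX_CLICKS) recent.removeLast();
            }

            // On hotkey press: snap to a stored click position if the gaze is
            // near its gaze point, otherwise use the gaze point itself.
            Point target (Point gaze) {
                for (Click c : recent)
                    if (gaze.distance(c.gaze) < SNAP_RADIUS) return c.click;
                return gaze;
            }
        }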

    Another thing I did was inspired by your projects, Tristan Hume. While holding the hotkey, if the gaze position is far from the mouse position, I jump the mouse to the gaze position. This can be nice if I hit the hotkey before looking where I actually want to go.

    Does anyone have other ideas for improving the hotkey-based system? It doesn’t have a good way to do double click or click+drag…

    #6064
    Tristan Hume
    Participant

    Sounds cool! A couple ideas:

    – That snapping is very similar to how my dynamic re-calibration works, except I do a more continuous weighting where I store a grid of offsets for different areas of the screen and use a weighted combination of the offsets to calculate the offset for a given point. See https://github.com/trishume/PolyMouse/blob/master/src/dlcTransformer.cpp

    – You could add a hotkey to toggle into a mode like my system for dragging and double clicking, where the mouse is constantly moving based on your head and eyes and you use your normal hotkey for clicking and dragging. Or even get a foot pedal.

    – When jumping the mouse to the gaze point, I found it much easier to figure out where the cursor had gone if I animated the mouse to the place (still very quickly) instead of teleporting it.

    – My system also only jumps the mouse to your gaze when you start moving your head, that way it isn’t distracting if you leave the system active.

    #6065
    Nate
    Participant

    A grid is an interesting idea, thanks! It’s similar to the calibration I was doing using arbitrary points. Bad data can still get into the grid, e.g. if you look at one position but use head tracking to place the mouse at a different position. However, I like that with a grid it’s easier to fix: setting a new head tracking offset erases the bad data for the grid cells involved. With arbitrary points, I didn’t have a way of deleting bad offsets.

    I gave using a grid a whirl. When head tracking is used to adjust the mouse, I store the offset in a cell for the initial gaze point. When using the gaze point to position the mouse, I find the cell containing the gaze point and the 3 other cells nearest it (using cell centers).

    I use the offsets of these 4 cells to interpolate an offset for the gaze point at pixel granularity. The mouse gets placed at the gaze point plus this offset. It seems to work pretty well! I need to live with it for a while, but I may not even need the snapping I described earlier.

    I see your grid is 8×6, though you do the weighting differently. I’m using 35×20 but haven’t experimented a whole lot. When I store the offset for a cell, I also store 66% of the offset in the surrounding 8 cells. This helps spread the offset out, with interpolation spreading it further. I initially tried a less dense grid without this, but didn’t like how it interpolated from a cell’s center to the next cell’s center when the next cell’s offset was zero.
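
    Roughly, the grid logic looks like the sketch below: simplified, with the constants from above. The real code is in Tobii.java and differs in the details.

        // Sketch of grid calibration: offsets are stored per cell and blended
        // bilinearly between the 4 nearest cell centers.
        public class GridCalibration {
            static final int COLS = 35, ROWS = 20;

            final float cellW, cellH;
            final float[][] ox = new float[COLS][ROWS], oy = new float[COLS][ROWS];

            GridCalibration (int screenWidth, int screenHeight) {
                cellW = screenWidth / (float)COLS;
                cellH = screenHeight / (float)ROWS;
            }

            // Store a head tracking correction in the gaze point's cell, also
            // writing 66% of it into the 8 surrounding cells.
            void store (float gazeX, float gazeY, float dx, float dy) {
                int cx = Math.min((int)(gazeX / cellW), COLS - 1);
                int cy = Math.min((int)(gazeY / cellH), ROWS - 1);
                for (int i = -1; i <= 1; i++) {
                    for (int j = -1; j <= 1; j++) {
                        int x = cx + i, y = cy + j;
                        if (x < 0 || x >= COLS || y < 0 || y >= ROWS) continue;
                        float scale = (i == 0 && j == 0) ? 1 : 0.66f;
                        ox[x][y] = dx * scale;
                        oy[x][y] = dy * scale;
                    }
                }
            }

            // Bilinearly interpolate an offset for a gaze point from the 4
            // nearest cell centers, at pixel granularity.
            float[] offset (float gazeX, float gazeY) {
                float fx = Math.max(0, Math.min(gazeX / cellW - 0.5f, COLS - 1));
                float fy = Math.max(0, Math.min(gazeY / cellH - 0.5f, ROWS - 1));
                int x0 = (int)fx, y0 = (int)fy;
                int x1 = Math.min(x0 + 1, COLS - 1), y1 = Math.min(y0 + 1, ROWS - 1);
                float tx = fx - x0, ty = fy - y0;
                float dx = lerp(lerp(ox[x0][y0], ox[x1][y0], tx), lerp(ox[x0][y1], ox[x1][y1], tx), ty);
                float dy = lerp(lerp(oy[x0][y0], oy[x1][y0], tx), lerp(oy[x0][y1], oy[x1][y1], tx), ty);
                return new float[] {dx, dy};
            }

            static float lerp (float a, float b, float t) {
                return a + (b - a) * t;
            }
        }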

    Code is in my Tobii class:
    https://github.com/EsotericSoftware/clippy/blob/master/src/com/esotericsoftware/clippy/Tobii.java

    For dragging I’m hesitant to have more than one hotkey or more hardware like a pedal. I think I can use a hotkey release then quickly press again and hold to start a drag, similar to a touchpad.

    For double click I think I can use a hotkey release then quickly press and release again. The second release would click a second time without triggering mouse movement.
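
    Both gestures could share a small state machine like this hypothetical sketch (the tap window duration is an assumed value, and this isn’t implemented yet):

        import java.awt.AWTException;
        import java.awt.Robot;
        import java.awt.event.InputEvent;

        // Sketch of tap timing: a quick second press after a click starts a
        // drag if held, or completes a double click if released quickly.
        public class TapGestures {
            static final long TAP_WINDOW_MS = 300; // max gap between taps (assumed)

            final Robot robot;
            long lastReleaseTime;
            boolean buttonHeld;

            TapGestures () throws AWTException {
                robot = new Robot();
            }

            void hotkeyPressed () {
                if (System.currentTimeMillis() - lastReleaseTime < TAP_WINDOW_MS) {
                    // Second press soon after a click: hold the button down and
                    // skip the gaze jump, so the mouse doesn't move.
                    robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
                    buttonHeld = true;
                }
                // Otherwise this press positions the mouse (gaze + head tracking).
            }

            void hotkeyReleased () {
                if (buttonHeld) {
                    // A quick release makes this the second click of a double
                    // click; a longer hold was a drag that ends here.
                    robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
                    buttonHeld = false;
                } else {
                    robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
                    robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
                }
                lastReleaseTime = System.currentTimeMillis();
            }
        }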

    Animating the gaze jump is a good idea, thanks. I don’t use it much, since I normally look and then hit the hotkey. However, I have noticed it can be confusing when the mouse is gaze jumped to the bottom of the screen: the cursor can’t be seen there, so it appears to have simply disappeared. Animating it into position would help the user understand what happened.
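
    Animating could be as simple as stepping the cursor along the path over a few frames, e.g. (a minimal sketch; the step count, timing, and easing are assumptions):

        import java.awt.MouseInfo;
        import java.awt.Point;
        import java.awt.Robot;

        public class CursorAnimation {
            // Move the cursor to the target over a few frames instead of
            // teleporting, so the eye can follow where it went.
            public static void animateTo (Robot robot, int targetX, int targetY) throws InterruptedException {
                Point start = MouseInfo.getPointerInfo().getLocation();
                int steps = 8; // still very quick: ~40ms total
                for (int i = 1; i <= steps; i++) {
                    float t = i / (float)steps;
                    t = t * t * (3 - 2 * t); // smoothstep easing
                    robot.mouseMove(
                        Math.round(start.x + (targetX - start.x) * t),
                        Math.round(start.y + (targetY - start.y) * t));
                    Thread.sleep(5);
                }
            }
        }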

    Maybe I’m some kind of bobble head, but with mouse head tracking always enabled I get annoyed by the mouse cursor moving around as my head moves. I end up trying to keep still, which is fatiguing.

    #6078
    Nate
    Participant

    I’ve now implemented double click, double tap to drag, and animated mouse movement. The grid calibration is working great. I found snapping is still good for repeatedly clicking in the same place. I cleaned up the code (see Tobii.java) and moved all the settings to the top. I’m quite happy with how it works!

    The only feature I considered but didn’t implement is persisting the grid offsets so they aren’t lost if Clippy is restarted. Clippy isn’t normally restarted, so I think it’s OK to be without that for now.

    If you guys give Clippy a try, I’d be interested in what you think of the mouse control.

    #6079
    Tristan Hume
    Participant

    That looks awesome! Glad my experimentation has at least helped one person in a minor way. I plan on using my system myself at some point as well. It may be a month or more though, since I need to get a larger SSD (to fit a Windows VM) and then a Tobii 4C first. I might end up using some of your hotkey-based mechanics as well, or maybe even porting your code to OSX (should be easy since it’s Java). I have a program that streams Tobii data from a Windows VM to the host.

    #6081
    Alex [Tobii]
    Participant

    Hello!

    One thing I’m interested in about the 4C is how much better the head tracking is. Is it the same as the eye position support of the EyeX, or is it substantially better? Is there a new API for it, or did the eye position accuracy just improve?

    The head tracking data in the 4C is different from the 3D eyeball positions and is much more precise. The head tracking API is not yet publicly available, but we are working hard to release it.

    Currently you can test head tracking in some games (for example Elite Dangerous) using our Infinite Screen Extension.

    #6417
    Cliff
    Participant

    Clippy’s mouse control via the 4C sounds exactly like what I want. I have the EyeX and had hoped it would help me keep my hands over the keyboard and avoid having to reach for the mouse. Unfortunately, the EyeX lacks the necessary accuracy. I kind of knew that going in, based on reading these forums, but nevertheless I felt the need to try it 😉 (You can find my initial amateurish attempt using AutoHotKey here: http://developer.tobii.com/community/forums/topic/accessing-gaze-data-stream-with-autohotkey/)

    I do have a few questions:

    1) How difficult would it be to isolate mouse control from the rest of Clippy? I’m not really interested in the other functionality of Clippy. If need be, I’ll try my hand at isolating it myself if/when I get a 4C. (I’m hoping for a discount like the one, which I missed, that was offered to fans of Elite Dangerous who already owned an EyeX. But that may be wishful thinking.)

    2) Any idea if Clippy’s mouse control works with EyeX? (Obviously without the head tracking accuracy of 4C.) I can test this myself, but thought I’d ask first.

    3) You’ve already managed to incorporate head tracking into your mouse control (based on eye positioning, I think), even though Alex points out that the head tracking API is not yet available. When the API becomes available, would that simplify/improve things in any way on your end?

    #6419
    Nate
    Participant

    AHK is pretty painful for doing anything, much less something this complex. 🙂

    1) It would be simple to remove the other Clippy features (though they are awesome). All features except the clipboard history (which is fantastic) can be disabled through configuration. Otherwise, to set up a dev environment: download Clippy’s source, download Eclipse, and import the existing project. It’s easy.

    2) The SDK is supposed to work with the EyeX, so Clippy should too.

    3) If the head tracking API turns out to be more accurate, that would be nice, but otherwise tracking the eye position works just fine.

    Since eye tracking alone is not accurate enough, it still comes down to: hold a button, make sure the mouse is in the right place, let go of the button. That takes away a LOT of the magic of accurate eye tracking, which feels like you are controlling the computer with your brain. Still, it’s pretty good and does reduce the need to grab the mouse.

    #6420
    Cliff
    Participant

    I’m not exactly a fan of AHK either, but given my lack of experience, it seemed like the path of least resistance at the time. I was probably wrong 😉

    I’ll give Clippy a try. Who knows, maybe I’ll prefer it over my current clipboard manager, Clipjump. (Which happens to have been written in AHK, but don’t hold that against it.) Clipjump came closest to what I was looking for in a clipboard manager; let’s see if Clippy can best it 😉

    #6421
    Nate
    Participant

    AHK has its place; I use it for a couple of things (hotkeys, monitoring new windows, auto-pressing a button). It’s just that for anything more than a little script, it is very painful.

    From the screenshots, Clippy is likely a bit more minimalistic than Clipjump. However, it’s that way because I don’t need or want tons of features. I want to pop up the history dialog, type something to search, and paste it, with no fluff. I’ve considered adding “favorites”, so you can star a clip and have it appear at the top of the list. Otherwise, I use Clippy daily and don’t miss any of the features that ArsClip, ClipClip, etc have. Plus Clippy does more, like uploading text or files to FTP so you can easily paste a link to someone (copy as normal, then hit Clippy’s upload hotkey), or taking a screenshot, uploading it to FTP/imgur/etc, and pasting a link.
