Neat, thanks for sharing! I figured I couldn’t be the only one to think of combining head and eye tracking. I am a bit surprised the Tobii software doesn’t have a mode to do it, considering how easy it is. My implementation:
https://github.com/EsotericSoftware/clippy/blob/master/src/com/esotericsoftware/clippy/Tobii.java
Interesting that you use sound to click/drag. I personally don’t have a problem with a hotkey; I use capslock. Otherwise it sounds like our approaches are similar, though I’m guessing your mouse control is always on?
Many years ago I used a TrackIR for mouse control for a while. My coworkers were always amused when I’d go to lunch wearing a reflective bindi, and of course they never reminded me to take it off until after lunch. I really like not having to wear a dot, hat, or even a headset. I used a hotkey with the TrackIR because with it always on I found myself trying to hold still to avoid the mouse bouncing around. Even with a hotkey, at my level of use (10-14 hour days) the TrackIR fatigued my neck and shoulders, and eventually I gave it up. Since then I’ve used a Kensington Expert Mouse (which is actually a trackball) for years; it has a great scroll ring.
I don’t know if the 4C is more accurate with head tracking than previous versions, but it works very well for fine adjustments after using eye tracking for the large jump. Fine adjustments generally need to move the cursor less than ~120 pixels from the gaze point on a 2560×1440 screen, and I can do that with high accuracy by moving my head just a few centimeters using the 4C. I don’t know if the head tracking accuracy is sufficient for moving the mouse across the entire screen, but given my TrackIR experience I’m not interested in that solution.
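To make the two-stage idea concrete, here’s a minimal sketch, assuming a hotkey-driven flow like mine: snap the cursor to the gaze point when the hotkey is pressed, then nudge it with head movement while the key is held. The class, the HEAD_GAIN factor, and the callback names are made up for illustration; this is not Clippy’s actual code.

```java
import java.awt.AWTException;
import java.awt.Robot;

public class TwoStagePointer {
	// Pixels of cursor movement per unit of head movement (assumed scale factor).
	static final double HEAD_GAIN = 40;

	final Robot robot;
	double cursorX, cursorY;

	public TwoStagePointer () throws AWTException {
		robot = new Robot();
	}

	/** Hotkey pressed: large jump to the last known gaze point. */
	public void hotkeyPressed (double gazeX, double gazeY) {
		cursorX = gazeX;
		cursorY = gazeY;
		robot.mouseMove((int)cursorX, (int)cursorY);
	}

	/** Head tracking sample while the hotkey is held: small correction around the gaze point. */
	public void headMoved (double deltaX, double deltaY) {
		cursorX += deltaX * HEAD_GAIN;
		cursorY += deltaY * HEAD_GAIN;
		robot.mouseMove((int)cursorX, (int)cursorY);
	}
}
```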
I also don’t know if the API has changed, but it’s as simple as retrieving the eye position values, so I expect any improvements would come from the device itself. My native code is here:
https://github.com/EsotericSoftware/clippy/blob/master/jni/com.esotericsoftware.clippy.tobii.EyeX.c#L140-L148
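On the Java side, a binding like that only needs to receive x/y gaze coordinates pushed up from the native code; connecting to the Tobii engine and subscribing to the gaze stream all lives in the C file linked above. A rough sketch, with hypothetical class and library names rather than Clippy’s actual ones:

```java
public class EyeTracker {
	static {
		System.loadLibrary("eyetracker"); // hypothetical native library name
	}

	// Latest gaze sample in screen pixel coordinates, written by the native thread.
	public volatile double gazeX, gazeY;

	/** Starts the native gaze data stream; returns false if no tracker is available. */
	public native boolean connect ();

	/** Stops the stream and releases native resources. */
	public native void disconnect ();

	/** Called from native code for every gaze sample. */
	void gazeEvent (double x, double y) {
		gazeX = x;
		gazeY = y;
	}
}
```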
The next thing I want to try is remembering the correction made with head tracking after the cursor jumps to the gaze point, then interpolating between those corrections to offset subsequent gaze points. In other words, I want to use the head tracking fine adjustments as calibration data to continuously improve eye tracking accuracy. Ultimately this is still limited by the 4C’s accuracy, but I wonder if my calibration would end up better than Tobii’s. At the very least, since my calibration would be constantly updated, it might remove the need to recalibrate periodically.
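Sketching what I have in mind (everything here is hypothetical, and the weighting scheme, plain inverse-distance weighting over stored corrections, is just one option):

```java
import java.util.ArrayList;
import java.util.List;

public class GazeCalibration {
	static class Sample {
		final double gazeX, gazeY; // Where eye tracking initially put the cursor.
		final double offsetX, offsetY; // Correction applied afterward via head tracking.

		Sample (double gazeX, double gazeY, double offsetX, double offsetY) {
			this.gazeX = gazeX;
			this.gazeY = gazeY;
			this.offsetX = offsetX;
			this.offsetY = offsetY;
		}
	}

	final List<Sample> samples = new ArrayList<>();

	/** Records the correction made by head tracking after a jump to a gaze point. */
	public void record (double gazeX, double gazeY, double offsetX, double offsetY) {
		samples.add(new Sample(gazeX, gazeY, offsetX, offsetY));
	}

	/** Offsets a new gaze point using corrections from nearby samples (inverse distance weighting). */
	public double[] adjust (double gazeX, double gazeY) {
		double weightSum = 0, offsetX = 0, offsetY = 0;
		for (Sample s : samples) {
			double dx = gazeX - s.gazeX, dy = gazeY - s.gazeY;
			double weight = 1 / (dx * dx + dy * dy + 1); // +1 avoids division by zero.
			weightSum += weight;
			offsetX += s.offsetX * weight;
			offsetY += s.offsetY * weight;
		}
		if (weightSum == 0) return new double[] {gazeX, gazeY};
		return new double[] {gazeX + offsetX / weightSum, gazeY + offsetY / weightSum};
	}
}
```

In practice the sample list would probably need to be capped or old samples decayed so stale corrections don’t accumulate, but that’s the general shape of the idea.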