Clippy: accurate mouse control
- This topic has 30 replies, 8 voices, and was last updated 5 years, 3 months ago by Nate.
- 07/03/2017 at 00:43 #6423CliffParticipant
I can confirm that Clippy’s mouse control works with my EyeX, although accuracy is lacking, as expected. Still, I’ll try using it for a longer period of time to see if it helps reduce my dependence on the mouse at all. I still hope to get a 4C eventually, though.
As for clipboard management, I’ll keep using Clippy for that, too, to see if I warm up to it. I should point out that Clipjump doesn’t do exactly what I want either. It probably has a lot of extra functionality that I don’t use. The primary feature I use is CTRL+V to paste the last item; CTRL+V,V to paste the 2nd last item; CTRL+V,V,V to paste the 3rd last item; etc. I don’t really want any UI to get in my way.
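The repeated-press scheme described above (one press pastes the last item, a quick second press steps back to the 2nd last, and so on) can be sketched as a small state machine. This is a hypothetical illustration of the idea, not Clipjump's or Clippy's actual code; the class name, timeout, and history representation are all assumptions:

```python
import time

class RepeatPaste:
    """Repeated presses of the paste hotkey within a short window step
    back one more item in the clipboard history (V = last item,
    V,V = 2nd last, V,V,V = 3rd last, ...)."""

    def __init__(self, history, window_s=0.5):
        self.history = history      # most recent item first
        self.window_s = window_s    # max gap between presses in one sequence
        self.index = -1
        self.last_press = 0.0

    def press(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self.last_press > self.window_s:
            self.index = 0          # new sequence: start at the last item
        else:                       # same sequence: step one item further back
            self.index = min(self.index + 1, len(self.history) - 1)
        self.last_press = now
        return self.history[self.index]
```

For example, with a history of `["c", "b", "a"]` ("c" copied most recently), three rapid presses return "c", "b", "a" in turn, and a later press starts over at "c".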
My ideal clipboard manager would be very much like Registers from jEdit (which I think is based on the concept of Registers from Emacs). Basically, there are multiple, independent clipboards. With easy-to-use key combos, I can copy/cut/paste directly to/from any clipboard. But, I want that functionality system-wide.
Anyway, this is probably getting off-topic. If I keep using Clippy for its clipboard management, I may poke you with some ideas/requests via GitHub. Or, I might try my hand at using it as a basis to implement my ideal clipboard manager.
Thanks a bunch!
- 07/03/2017 at 10:50 #6426
Great, I’d be interested in how you feel about Clippy’s eye tracking (and other features) after some time. The hardest thing for me is to remember not to grab the mouse.
Clippy doesn’t quite do what you want (multiple hotkeys for separate keyboards/registers), though it could with minor adjustments. The code is very well organized. To paste your 2nd last item: ctrl+shift+V, alt, 2 (alt+2 also works). 3rd last item: ctrl+shift+V, alt, 3. ctrl+shift+V opens the popup, alt shows the popup menu, and number keys choose an items. Also you can do ctrl+shift+V, down, enter. The “lock order” checkbox (shown after pressing alt) can be useful when using the number keys a lot, so the items keep their same number keys.02/04/2017 at 16:09 #6594aashtonkParticipant
@nate Clippy looks like a very useful program and I would like to try it and give you some feedback. It sounds like just the thing I’ve been looking for. I haven’t done a whole lot in Java and am having trouble getting it to work. I have the Java SDK 1.8 and my JAVA_HOME is configured, and I’m running Windows 10. When I try to start it from Clippy.exe or manually start Clippy.jar, it doesn’t give me any errors or warnings; nothing seems to happen. I don’t see any new icons in the system notification area either. Any thoughts?
- 03/04/2017 at 03:12 #6596
@aashtonk Look in: c:\users\USERNAME\.clippy\clippy.log
If it’s a null pointer exception, look in config.json for “gamma”: null and change it to an empty value, e.g. “gamma”: “”. I’ll fix that bug eventually, but I’m traveling right now.
- 04/04/2017 at 08:04 #6600aashtonkParticipant
@nate The null pointer exception was the problem. Very cool what you have built here! Tracking with the Tobii paired with head movement is an awesome feature!
One feature I might request is the ability to disable the “head tracking adjustments” once I have it moderately well trained. Sometimes I like to jump around to parts of the screen generally and not be as precise, for example when I’m just changing the focus of a window. During these times I worry I am training it poorly.
- 07/04/2017 at 04:52 #6630
@aashtonk I’m glad you like it! 🙂
While doing head tracking, if you move your physical hardware mouse at all, it won’t store any training data. However, don’t worry too much about the training. The eye tracking is sensitive and may give slightly different results simply from you shifting your seating position.
Training should be natural and not something you need to be conscious of: you just look, hold the hotkey, move your head if needed, then release to click. The next time you look at that area it will likely be more accurate. If you do get bad training data, that area may be slightly inaccurate at first, but if you just move your head to fix it, the bad data is gone.
- 29/07/2017 at 19:10 #7217
I’m having an issue. I downloaded Clippy.exe, Clippy.jar, and the 2.34 source code, and put them into the same folder on my desktop. I open Clippy-2.34 > src > com > esotericsoftware > clippy, then Config.java, which I open with WordPad. I change public int breakWarningMinutes from 55 to 100 and save, but when I open Clippy.exe/Clippy.jar the changes aren’t applied. When I hover over the icon in the taskbar, it still says “Active: 0 minutes, Break in: 55 minutes”. Why aren’t my changes being applied? Thanks for the help!
- 29/07/2017 at 19:21 #7218WilliamParticipant
@jamestaylor You want to change the configuration file, not the source code (the source code is for compiling). The configuration file is at “C:\Users\<username>\.clippy\config.json”.
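To illustrate: the fields in config.json mirror the ones in Config.java, so the change James wanted would look something like the fragment below. This is a sketch of the one relevant field only, not a complete config.json; the rest of the file should be left as Clippy generated it.

```json
{
    "breakWarningMinutes": 100
}
```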
@Nate great work btw!
- 31/07/2017 at 07:51 #7221
Haha, I literally JUST found the file before I refreshed the page! That’s what you call timing! I came back to remove my submission.
- 31/07/2017 at 07:57 #7222
I’m looking to use this program as a way of using gimbaled weapons during flight in Star Citizen. Has anyone tried it, and if so, was it successful?
- 02/08/2017 at 03:45 #7230
Wow, that’s really cool to see people interested in using head tracking and eye tracking together! Before I saw this forum I created my own version that is focused specifically on mouse emulation. It can use a variety of head trackers, including the EyeX 4C, TrackIR, and SmartNav. The mouse follows where you look around the screen. It doesn’t have the teleport feature on a mouse press, but I like how you do it, Nate! Check it out at http://precisiongazemouse.com/
Also, I would like to try Clippy, but I’d rather not install Java because it constantly needs to be updated. I’ll add an issue to your GitHub project to make a version that doesn’t require a JRE.
- 02/08/2017 at 05:19 #7231Tristan HumeParticipant
Henry, that’s pretty awesome. There are two important things I’ve learned from your system:
The Eye Tracker 4C’s head tracking isn’t as precise and responsive as the TrackIR’s, and wouldn’t be very good for a combo system. I was planning on buying one to test this, but now I don’t have to. I might still buy one anyway, though, for its other advantages, like lower CPU load for plain eye tracking.
Somehow your EyeX works in combination with the TrackIR. I wrote off this approach because when I tried the TrackIR with my Eye Tribe tracker, it was totally broken by the extra glints the TrackIR’s lights created in my eyes. I forget whether I tested it with the EyeX as well and it didn’t work, or whether I just wrote it off as fundamentally incompatible with remote eye trackers. I’ll have to try again now. Have you just never noticed any problems with this? This is awesome news, because this (potentially imagined) incompatibility was the main roadblock to my using the system.
Also, may I recommend you take a look at the source code of my https://github.com/trishume/PolyMouse project. I use an (IMO) better way of deciding when, how, and where to warp the mouse cursor. It animates the cursor to the location (quickly) so your eyes don’t lose track of it, and only when you start to move your head. It also only animates it to the edge of a circle around the gaze point, so that it doesn’t overshoot and you never have to change the direction of your head movement, only fine-tune it into position. I’ve done a bunch of pointer speed testing experiments with myself and other participants that suggest this method is faster and more pleasant to use. I encourage you to adopt it; I think you’ll like it.
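The circle-edge idea described above can be sketched as follows. This is a simplified reconstruction of the geometry only, not PolyMouse's actual C++ code; the function name is made up, and the animation step (moving the cursor gradually rather than instantly) is omitted:

```python
import math

def warp_target(cursor, gaze, radius):
    """Compute where to warp the cursor: toward the gaze point, but
    stopping at the edge of a circle of `radius` pixels around it, so
    the warp never overshoots and fine positioning is left to head
    tracking (always moving the head in the same direction)."""
    dx, dy = cursor[0] - gaze[0], cursor[1] - gaze[1]
    dist = math.hypot(dx, dy)
    if dist <= radius:
        return cursor  # already inside the circle: no warp needed
    scale = radius / dist  # shrink the offset onto the circle's edge
    return (gaze[0] + dx * scale, gaze[1] + dy * scale)
```

For instance, with the cursor at (0, 0), the gaze at (100, 0), and a 20-pixel circle, the warp stops at (80, 0), leaving the last 20 pixels to head movement.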
I can’t wait for Tobii to release the streaming API for OSX that they say is coming. Once that happens I’ll definitely buy a 4C and set up a mouse/head tracking system on OSX with it.
Email me at [email protected] if you have any questions.
- 03/08/2017 at 01:54 #7233
@trishume That’s a great idea, to move the mouse pointer to the edge of a circle around the gaze point and then continue with head motion. I don’t have a Mac, so by any chance do you have any videos you can share? It would be great to see it in action. One potential issue I’m wondering about is how this works with small movements, like on the keys of an on-screen keyboard. If there is some threshold of head movement that initiates a mouse movement, can it detect small movements between keys? With Precision Gaze Mouse the tracking is continuous, so you’re able to hit the keys of an on-screen keyboard without making big movements.
Yes, I got mine working with the TrackIR 5 and EyeX 4C together. It works best if I wear contacts instead of glasses and use the TrackClip Pro, which has IR lights on it. They are bright enough that the tracker is not distracted by the reflections from the EyeX.
- 03/08/2017 at 04:41 #7234Tristan HumeParticipant
@Henry No, I don’t have a video, sorry. Your best bet is probably just to look at the code and infer the behaviour from that. The relevant code is here: https://github.com/trishume/PolyMouse/blob/master/src/animatedMagicPipeline.cpp
The closest I can give you is to suggest trying http://www.yorku.ca/mack/FittsLawSoftware/ with your system and your mouse to get a feel for the scores (the settings I use are here: https://gist.github.com/trishume/3755fea9b0cd7460701d63398a8e3a04). For comparison, I get 3.2 bits/s with pure head tracking, 3.7 bits/s with a MacBook trackpad, 4.5 bits/s with my system, and 5.0 bits/s with a mouse. My other pilot participant reached 3.9 bits/s with my system after a couple hours of practice.
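For context on those bits/s figures: they are Fitts's-law throughput scores. A minimal sketch of how such a score is computed, assuming the standard Shannon formulation (the exact formula and averaging the linked software uses may differ in detail):

```python
import math

def throughput(amplitude, width, movement_time_s):
    """Fitts's-law throughput in bits/s for one pointing task.
    Index of difficulty ID = log2(A/W + 1), with target distance
    (amplitude) and target width in the same units, e.g. pixels;
    throughput is ID divided by the movement time in seconds."""
    index_of_difficulty = math.log2(amplitude / width + 1)
    return index_of_difficulty / movement_time_s
```

For example, hitting a 30-pixel target 480 pixels away in one second scores log2(17) ≈ 4.09 bits/s, which puts it between the trackpad and mouse numbers quoted above.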
To answer your question about small movements, similar to your system, it has a circle within which it won’t trigger the animated warping. For increased robustness it actually has two circles, one around the current cursor location and one around the last warp destination, so whether your eyes track the cursor or stay at a fixation point, it won’t trigger unwanted small warps. So small movements are done purely with head tracking.
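The two-circle dead zone described above can be sketched like this. Again a reconstruction of the described logic rather than the actual PolyMouse source, with a made-up function name:

```python
import math

def should_warp(gaze, cursor, last_warp, radius):
    """Only trigger a warp when the gaze point lies outside BOTH
    circles: one around the current cursor location and one around the
    last warp destination. Whether the eyes track the moving cursor or
    stay on a fixation point, small glances won't trigger unwanted
    warps, so small movements fall through to pure head tracking."""
    def inside(center):
        return math.hypot(gaze[0] - center[0], gaze[1] - center[1]) <= radius
    return not (inside(cursor) or inside(last_warp))
```

Looking near either the cursor or the previous warp point returns False (no warp); only a glance well away from both triggers a new warp.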
The fact that you use the TrackClip Pro makes a lot of sense. I hadn’t thought of using it but now that you’ve clued me in, that’s a great solution, since it won’t create extra glints in your eyes. I’ll probably buy one or make one.
- 10/08/2017 at 19:12 #7261
After trying Clippy out I really liked how @Nate was moving the mouse after a key press, and how he handled dragging and double-clicking. However, I found the accuracy of the 4C head tracking to be lacking, especially around the corners of the screen. I wanted to try it with the improved accuracy of the TrackIR, so I added support for moving after a key press to Precision Gaze Mouse. @Cliff, give this version a try for better accuracy: http://precisiongazemouse.com/