Cool. I think you may be interested in the 4 months of experimentation and research I did on the subject of mouse control by combining eye tracking and head tracking. All the code I wrote is open source here: https://github.com/trishume/PolyMouse
I used a variety of different methods of combining eye and head tracking, none of which are the same as yours, because I was intent on supporting use without a hotkey and allowing easy hovers and drags. My best system basically acts like a head-tracking mouse, but when you try to move a long distance it uses eye tracking to quickly snap the cursor to where you are looking. With it I am as fast and accurate as I am on a nice Macbook trackpad, which is quite good and almost as fast as a mouse.
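To make the idea concrete, here's a minimal sketch of that kind of fusion: head tracking drives small relative movements, and a large gap between the cursor and the gaze point triggers a warp. The function name and the threshold value are just illustrative, not PolyMouse's actual code:

```python
def fuse(cursor, head_delta, gaze, warp_threshold=200.0):
    """Move cursor by the head-tracking delta; warp toward the gaze point
    if it is far away (a long, intentional movement).

    cursor, gaze: (x, y) in pixels; head_delta: (dx, dy) in pixels.
    warp_threshold: illustrative distance in pixels beyond which we warp.
    """
    x, y = cursor[0] + head_delta[0], cursor[1] + head_delta[1]
    gx, gy = gaze
    distance = ((gx - x) ** 2 + (gy - y) ** 2) ** 0.5
    if distance > warp_threshold:
        # Long movement: jump to the gaze point, then let head
        # tracking refine the final position precisely.
        return (gx, gy)
    return (x, y)
```

The nice property is that small corrections stay under precise head control (eye-tracking jitter never moves the cursor), while long movements get the speed of gaze.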
I tried using a Tobii EyeX with my system (note to Tobii people: I only ever used it for real-time interaction), and unfortunately its head tracking support wasn’t accurate enough for really good control, and my IR head tracker interfered with its tracking. I eventually wrote my own accurate visible-light head tracker that used a PS Eye: https://github.com/trishume/SmartHeadTracker
Together, the EyeX and the coloured dot tracker actually worked reasonably well, except near the corners of the screen, where tracking quality degrades.
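For anyone curious, the core of a coloured dot tracker is just finding the centroid of pixels close to a target colour in each camera frame. This is a toy pure-Python sketch of that idea, not SmartHeadTracker's actual implementation (which would operate on real camera frames and be far more robust):

```python
def dot_centroid(frame, target, tol=40):
    """Return the (row, col) centroid of pixels within tol of the target
    colour, or None if no pixel matches.

    frame: 2D list of (r, g, b) tuples; target: (r, g, b); tol: per-channel
    tolerance (illustrative value).
    """
    row_sum = col_sum = count = 0
    for r, row in enumerate(frame):
        for c, pixel in enumerate(row):
            if all(abs(p - t) <= tol for p, t in zip(pixel, target)):
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)
```

From the dot's position in successive frames you can derive head movement deltas, which is what feeds the relative-movement half of the fusion.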
One thing I’m curious about with the 4C is how much better its head tracking is. Is it the same as the EyeX’s eye-position support, or is it substantially better? Is there a new API for it, or did the eye-position accuracy just improve? Nate, I’m not sure if you’ll be able to answer these questions, but if any Tobii people see this, I’d appreciate an answer.