Sorry to bump an old thread, but I'm very interested in this subject myself. I was just poring over the sample code trying to figure out how to use the EyeX as a head tracker. Right now I'm stuck on how to convert from UCS (user coordinates) to something relative to the center of the display — specifically, how do I detect and remove the pitch of the sensor? I'm sure a dive into the calibration data will get me there in the end.
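For what it's worth, the conversion I'm after would look something like this (Python sketch; the pitch angle and the sensor-to-display-center offset are placeholders you'd pull from the calibration/display-setup data — I haven't verified how the EyeX SDK actually exposes them):

```python
import math

def ucs_to_display(point_ucs, pitch_deg, sensor_offset_mm):
    """Map a point from the tracker's user coordinate system (origin at
    the sensor) to display-centered coordinates: rotate about the x-axis
    to undo the sensor's pitch, then translate by the sensor's offset
    from the display center. pitch_deg and sensor_offset_mm are assumed
    inputs, not values the SDK is known to hand you directly."""
    x, y, z = point_ucs
    c = math.cos(math.radians(pitch_deg))
    s = math.sin(math.radians(pitch_deg))
    # Rotate by -pitch about x to level the coordinate frame
    y_level = c * y + s * z
    z_level = -s * y + c * z
    ox, oy, oz = sensor_offset_mm
    return (x + ox, y_level + oy, z_level + oz)

# Example: sensor mounted 200 mm below display center, no pitch
print(ucs_to_display((0.0, 0.0, 600.0), 0.0, (0.0, 200.0, 0.0)))
```

With zero pitch this is just a translation; with a pitched sensor the rotation stops the head's reported height from drifting as the user moves closer or farther away.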
I'm also wondering whether you couldn't identify other features on the face, such as the ears, to support a broader tracking range. Most head trackers have a frustratingly small working volume, and anything you can do to broaden it would be welcome.
I feel that one of the best uses for the EyeX in video games would be head tracking without needing a reflector — basically a specialized version of the Kinect that concentrates only on the head. That would let you use your monitor as a window into a 3D world and really break you out of the box.
Traditionally, head trackers have been misused to aim the camera, but what you really want is to track your head in order to create a 'window' into the 3D world. For that, all you need to track is the x, y, z position of your head relative to the center of the display (well, really you want the physical view frustum, which is a tiny bit different).