3D Projection

Viewing 2 posts - 1 through 2 (of 2 total)
  • #1240
    David Tucker

    I am a developer at iRacing.com, an online race car simulator. My boss recently had me pick up the EyeX, and I’m struggling to find a use for it in our relatively narrow application space. Because our game relies heavily on a steering wheel and pedals for input, we don’t really need to target on-screen objects or use the mouse for interaction. I thought about using it to help train new drivers to properly look down the track when racing, but that is just as easily accomplished with a ‘bouncing ball’ that draws their eyes to the correct location, and it would only need to be used once.

    Then it hit me: we provide a very realistic render engine, with all geometry scaled to real-world dimensions. However, the average user does not benefit from that effort because their monitor is not set up with the correct FOV needed to scale the world properly. In addition, we support multi-monitor setups with a separate render for each monitor in order to truly immerse you in our world. But that makes things even worse, because you now have three monitors to locate and position just right in order to create a seamless rendered world. It would be nice if you could just look at the monitor(s) in some magic way and we could, in turn, figure out exactly where your head is in relation to the monitors, so that we could properly position our cameras and create a seamless experience.
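The camera positioning described above is usually done with an asymmetric ("off-axis") frustum: treat each monitor as a fixed window in space and recompute the frustum bounds from the tracked head position every frame. A minimal sketch of that idea, in monitor-local coordinates (all names and numbers here are illustrative, not part of any EyeX API):

```python
def off_axis_frustum(eye, screen_half_w, screen_half_h, near):
    """Asymmetric frustum bounds for a monitor centred at the origin of
    its own plane (x right, y up, viewer at z > 0 looking toward -z).

    eye            -- (ex, ey, ez) head position in metres, screen-local
    screen_half_w  -- half the physical screen width, metres
    screen_half_h  -- half the physical screen height, metres
    near           -- near clip distance

    Returns (left, right, bottom, top) at the near plane, ready to feed
    into a glFrustum-style projection.
    """
    ex, ey, ez = eye
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_half_w - ex) * scale
    right  = ( screen_half_w - ex) * scale
    bottom = (-screen_half_h - ey) * scale
    top    = ( screen_half_h - ey) * scale
    return left, right, bottom, top
```

With the head centred the frustum is symmetric; as the head moves sideways the bounds shift the opposite way, which is what creates the "window into another world" effect. For multiple monitors, each display gets its own call with the head position expressed in that monitor's local frame.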

    That is all great, but the current eye tracker does not seem to work quite well enough to get there. Ideally, we would like a calibration process where the user first derives the dimensions of their monitor(s) by measuring the marks on the EyeX device relative to on-screen graphics. The user would then look at all four corners of each monitor, and from that data we would come up with a relative distance and angle to each of up to three monitors. We actually don’t even need to know the size of the monitors, only their horizontal/vertical FOVs and the angle between monitors. Given that information, we could adjust the projection to fit quite accurately. And as an extra bonus step, we could continue to track the head position (based on the eyes) to create a ‘TrackIR’-style experience without the need for reflectors, allowing the user to reposition their head to see around obstructions, or get a more ‘Rift-like’ 3D experience without the burden of wearing a headset.
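If the tracker can report a direction from the head to each of the four corners a user fixates, the per-monitor FOVs the post asks for fall out of simple vector geometry. A sketch under that assumption (the corner ordering and function names are mine, not an EyeX interface):

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def monitor_fov(corner_dirs):
    """Estimate a monitor's angular size as seen from the head.

    corner_dirs -- gaze directions from the head to the four corners,
                   ordered top-left, top-right, bottom-right, bottom-left.
    Returns (horizontal_fov, vertical_fov) in degrees, averaging the two
    opposite edges to smooth out measurement noise.
    """
    tl, tr, br, bl = corner_dirs
    h = (angle_between(tl, tr) + angle_between(bl, br)) / 2
    v = (angle_between(tl, bl) + angle_between(tr, br)) / 2
    return h, v
```

The same `angle_between` applied to the centre directions of two adjacent monitors gives the angle between them, which together with the FOVs is exactly the information the post says is sufficient to set up the projection.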

    To get to the point, we need to be able to track much farther than the +/-30 horizontal degrees that the current EyeX seems to support. Ideally, we would be able to track the eye/head across a full 180 degrees of horizontal motion (+/-90 degrees). I don’t know if this is possible with a single EyeX device, or if you would need to gang up two or three eye trackers. In addition, the calibration phase would need to be extended to support multiple displays.
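For rough planning, the device count follows from simple arithmetic on the angular ranges. The 60-degree per-device figure below comes from the +/-30 degrees mentioned above; the overlap reserved for hand-off between devices is purely an assumption, not a published EyeX specification:

```python
import math

def trackers_needed(total_deg, per_device_deg, overlap_deg=0):
    """How many trackers, fanned out with `overlap_deg` of shared
    coverage between neighbours for hand-off, span `total_deg` of yaw.
    Illustrative arithmetic only; all figures are assumptions."""
    effective = per_device_deg - overlap_deg
    return math.ceil((total_deg - overlap_deg) / effective)
```

With no overlap, covering 180 degrees at 60 degrees per device needs three trackers, matching the "two or three" guess above; reserving 10 degrees of overlap per seam pushes that to four.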

    I think there is real potential here in being able to walk up to any monitor and treat it like a true view into another world, with full head tracking. But it needs a fairly large operating window if it is going to truly succeed, and the setup and configuration need to be simple enough that the casual user is not frustrated by the process.



    Hi David,
    Thanks for posting. I’ll forward this to our hardware and firmware engineers.
