lots of questions there 🙂 — starting from the top:
Q. Is it sensible to use the same calibration for all users?
A. It depends on the level of accuracy you need. If you only need a rough indication of where the user is looking, then you might be able to do without a calibration. The lack of proper calibration will show up as an offset error in the gaze point of up to several centimeters — which doesn’t matter at all in some applications but ruins everything in other cases. My recommendation would be to give it a try and see if you can do without, especially since the users will only spend a couple of minutes at the kiosk station.
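To make the tradeoff concrete, here is a back-of-the-envelope check: if your clickable targets are comfortably larger than the worst-case offset, a shared (or skipped) calibration may be good enough. This is an illustrative sketch, not anything from the SDK; the function name, the 3 cm worst-case figure, and the safety margin are all assumptions you would tune for your own setup.

```python
def can_skip_calibration(target_size_cm, worst_case_offset_cm=3.0, safety_margin=1.5):
    """Rough feasibility check for running without per-user calibration.

    The gaze point may be off by up to worst_case_offset_cm in any
    direction, so the target's half-size must exceed that offset
    (times a safety margin) for hits to land reliably.
    All numbers are illustrative assumptions, not SDK values.
    """
    return target_size_cm / 2 >= worst_case_offset_cm * safety_margin
```

For example, a kiosk with 10 cm buttons passes this check, while one with 4 cm buttons does not, so the latter would likely need a real calibration step.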
Q. Is it possible to calibrate the device behind-the-scenes?
A. Yes. The low-level Gaze SDK provides an API for calibrating the device. It works roughly like this: you attract the user’s eye gaze to a point on the screen and tell the tracker to record a calibration point at the same moment. Repeat for as many points as needed. (This is of course a simplified description. In practice it’s a bit tricky to get right, because the API is asynchronous and you will need some error handling and so on.)
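The loop described above can be sketched as follows. This is a simplified, synchronous Python sketch of the control flow only; the names (`run_calibration`, `add_point`, `compute_and_apply`, and so on) are illustrative stand-ins and not the actual Gaze SDK API, which is C-based and asynchronous. The retry logic stands in for the error handling mentioned above.

```python
# Five-point pattern (corners plus center), in normalized screen
# coordinates. The pattern and retry count are illustrative choices.
CALIBRATION_POINTS = [(0.1, 0.1), (0.9, 0.1), (0.5, 0.5), (0.1, 0.9), (0.9, 0.9)]

def run_calibration(tracker, show_stimulus, max_retries=2):
    """Drive a behind-the-scenes calibration: for each point, draw an
    attention-grabbing stimulus there, then ask the tracker to record
    gaze samples for that point. Retries a point before giving up."""
    tracker.start_calibration()
    try:
        for point in CALIBRATION_POINTS:
            for _attempt in range(max_retries + 1):
                show_stimulus(point)          # attract the user's gaze here
                if tracker.add_point(point):  # record samples; may fail
                    break                     # e.g. if the user looked away
            else:
                raise RuntimeError("could not calibrate point %s" % (point,))
        tracker.compute_and_apply()           # build and set the new profile
    finally:
        tracker.stop_calibration()            # always leave calibration mode

class FakeTracker:
    """Offline stand-in for the device, so the flow can be exercised
    without hardware. When flaky=True, the first sample at each point
    fails, exercising the retry path."""
    def __init__(self, flaky=False):
        self.flaky = flaky
        self.attempts = {}
        self.points = []
        self.calibrated = False
        self.active = False

    def start_calibration(self):
        self.active = True

    def add_point(self, point):
        n = self.attempts.get(point, 0)
        self.attempts[point] = n + 1
        if self.flaky and n == 0:
            return False  # simulate a bad sample on the first attempt
        self.points.append(point)
        return True

    def compute_and_apply(self):
        self.calibrated = True

    def stop_calibration(self):
        self.active = False
```

In a real kiosk, `show_stimulus` would animate something eye-catching (a shrinking dot, a character) at the given point, and you would wait for the SDK’s asynchronous callbacks instead of returning immediately.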
Q. Is there a simple way of launching the calibration process from within my application?
A. If you use the low-level Gaze SDK, then the answer is definitely yes. If you use the EyeX SDK then the answer is no, not at the moment, but we are considering that as a future extension.