This was the GazeSDK solution I was referring to: https://developer.tobii.com/community/forums/topic/eyex-gazesdk-and-monitor-less-calibration/. If you could offer a workaround, that would be great.
What exactly do you mean by “you could calibrate on screen and then simply replace the screen with an appropriately sized stimulus, but such a setup would of course not be optimal. You would however at least have the 3D Gaze Vector exposed for analysis.”?
I need a free working area in front of the person and have to track which objects in that working area he or she is fixating on. A single stimulus object at the position of the calibration screen is therefore not sufficient. I take it that for this setup there is no way to get the 3D gaze vector with an EyeX or a 4C eye tracker, right?
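For context, here is a minimal sketch of what I would do with such a 3D gaze vector if it were exposed: build a ray from the eye position through the gaze point and check which known object in the working area the ray passes closest to. All names, coordinates, and the tolerance are purely illustrative assumptions (in tracker coordinates, millimeters); this is not based on any EyeX/4C API, since those SDKs do not expose this data.

```python
import math

def gaze_ray(eye_pos, gaze_point):
    """Return (origin, unit direction) of the ray from the eye through the gaze point."""
    d = [g - e for g, e in zip(gaze_point, eye_pos)]
    norm = math.sqrt(sum(c * c for c in d))
    return eye_pos, [c / norm for c in d]

def point_ray_distance(point, origin, direction):
    """Shortest distance from a 3D point to the ray origin + t*direction, t >= 0."""
    v = [p - o for p, o in zip(point, origin)]
    t = max(0.0, sum(a * b for a, b in zip(v, direction)))
    closest = [o + t * c for o, c in zip(origin, direction)]
    return math.dist(point, closest)

def fixated_object(eye_pos, gaze_point, objects, tolerance=30.0):
    """Name of the object (center in mm) nearest the gaze ray, or None if
    nothing lies within `tolerance` mm of the ray. `objects` maps names
    to 3D center positions; all values here are hypothetical examples."""
    origin, direction = gaze_ray(eye_pos, gaze_point)
    name, center = min(objects.items(),
                       key=lambda kv: point_ray_distance(kv[1], origin, direction))
    return name if point_ray_distance(center, origin, direction) <= tolerance else None

# Hypothetical example: eye 600 mm in front of the tracker origin, looking
# straight ahead; a "cup" sits on the gaze line, a "book" 150 mm off to the side.
eye = (0.0, 0.0, 600.0)
gaze = (0.0, 0.0, 0.0)
workspace = {"cup": (0.0, 0.0, -200.0), "book": (150.0, 0.0, -200.0)}
print(fixated_object(eye, gaze, workspace))  # → cup
```

So even a coarse gaze origin plus gaze point pair would be enough for my use case; the open question is whether any SDK for these trackers exposes it.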