11/07/2018 at 10:00 #8570
Susanne Trick (Participant)
I need a remote eye tracker that can be used in Linux and with Python. This should work with the Tobii eyeX and the Tobii Pro SDK (with a license). The special thing is that I need the 3D gaze vector without a screen. I think this should be possible in general because you need the vector to intersect it with the screen. But I only found that someone did this with the now deprecated GazeSDK. Is there any possibility to achieve this with current software? Has anyone experience with this kind of task?
Thanks in advance!
Susanne

13/07/2018 at 12:45 #8595
Grant [Tobii] (Keymaster)
Hi @susanne, and thanks for your query. I am afraid that whilst the Tobii Tracker 4C will work with Linux under the Pro SDK, this is not applicable to the EyeX: most of the information processing for this older hardware is done in software installed on the PC (as opposed to onboard, in the case of the 4C), and that software was never ported to Linux (at least publicly).
Certainly, you can use the EyeX with Python and the Pro SDK under Windows, but I am afraid that, due to the constraints of the software used, a “no screen” option is not something that is generally supported with the Tech range of trackers either. For this kind of advanced setup we have the Tobii Pro range of trackers, which are designed to allow for it.
All that being said, if you were able to work under Windows with the Device you could calibrate on screen and then simply replace the screen with an appropriately sized stimulus, but such a setup would of course not be optimal. You would however at least have the 3D Gaze Vector exposed for analysis.
If you need to use Linux and in particular a screenless setup, then a Tobii Pro solution would be the most appropriate. I was curious which GazeSDK solution you were referring to? Perhaps there is some workaround that we can offer. Apologies for the disappointing news.

14/07/2018 at 17:57 #8597
Susanne Trick (Participant)
This was the GazeSDK solution I referred to: https://developer.tobii.com/community/forums/topic/eyex-gazesdk-and-monitor-less-calibration/. If you could offer some workaround, this would be great.
What exactly do you mean by “you could calibrate on screen and then simply replace the screen with an appropriately sized stimulus, but such a setup would of course not be optimal. You would however at least have the 3D Gaze Vector exposed for analysis.”?
I need a free working area in front of the human and need to track which objects in this working area the human is fixating on. So one stimulus object at the position of the calibration screen is not sufficient. I guess for this setup there is no possibility to get the 3D gaze vector with an EyeX or a 4C eye tracker then, right?

16/07/2018 at 12:35 #8600
Grant [Tobii] (Keymaster)
Hi @susanne, okay, thanks for that. Yes, I can confirm that there is no longer a means to create your own custom calibration routine (as there was with the GazeSDK) unless you work with the Tobii Core SDK Stream Engine and purchase a special licence to do so.
What I meant previously was that a possible workaround would be to calibrate in a regular fashion on a screen and thereafter replace the screen with the stimulus such as a magazine, book, projected screen, etc.
However, now that you have explained you want to track in the Z-direction as well, that does change the circumstances significantly. There is no setup we offer (including within the Tobii Pro range of trackers) that will correctly track your gaze location in three dimensions.
BTW, by 3D Gaze Vector, I assume you were referring to the Gaze Position (x,y,z) within the User Coordinate System as described at http://developer.tobiipro.com/commonconcepts/coordinatesystems.html
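For reference, under Windows with a Pro licence these 3D fields are exposed through the Pro SDK's official Python binding, `tobii_research`. A minimal sketch (the stream field names come from the Pro SDK documentation; the `gaze_direction` helper is my own illustration, not part of the SDK):

```python
import math

def gaze_direction(origin, point):
    """Unit vector from the eye's 3D origin toward the 3D gaze point.

    Both inputs are (x, y, z) tuples in the user coordinate system
    (millimetres, origin at the tracker).
    """
    d = tuple(p - o for p, o in zip(point, origin))
    norm = math.sqrt(sum(c * c for c in d))
    return tuple(c / norm for c in d)

def print_gaze_vectors():
    # Requires a connected, Pro-licensed tracker; tobii_research is the
    # Tobii Pro SDK Python binding (pip install tobii_research).
    import tobii_research as tr

    def on_gaze(gaze_data):
        origin = gaze_data['left_gaze_origin_in_user_coordinate_system']
        point = gaze_data['left_gaze_point_in_user_coordinate_system']
        print(gaze_direction(origin, point))

    tracker = tr.find_all_eyetrackers()[0]
    tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, on_gaze, as_dictionary=True)
```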
For example, if you had one object close and another far away, but both along the same line of sight, there would be no way to determine which you were looking at, even with the 3D Gaze Vector. Such tasks are possible, although only with head-mounted eye trackers such as the Tobii Pro Glasses 2 (https://www.tobiipro.com/product-listing/tobii-pro-glasses-2/).
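The ambiguity is easy to see numerically: any object on the gaze ray makes a zero angle with it, regardless of depth. A small sketch (the coordinates are made up for illustration):

```python
import math

def angle_to(origin, direction, target):
    """Angle (radians) between a gaze ray and the direction from the
    eye to a target point."""
    to_t = [t - o for t, o in zip(target, origin)]
    dot = sum(d * t for d, t in zip(direction, to_t))
    n1 = math.sqrt(sum(d * d for d in direction))
    n2 = math.sqrt(sum(t * t for t in to_t))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

eye = (0.0, 0.0, 0.0)
gaze = (0.0, 0.0, 1.0)        # looking straight ahead
near = (0.0, 0.0, 300.0)      # object 30 cm away
far = (0.0, 0.0, 900.0)       # object 90 cm away, same line of sight

# Both objects lie exactly on the gaze ray, so the angles are identical
# and depth alone cannot disambiguate them:
print(angle_to(eye, gaze, near), angle_to(eye, gaze, far))  # 0.0 0.0
```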
With all other trackers including those from Tobii Tech (EyeX, 4C) & Tobii Pro (X60, T120, etc) the final gaze *location* is only ever given within a 2D Plane as initially calibrated.
All that being said, if your environmental setup is such that objects are spaced reasonably far apart, with little overlap in the Z-dimension, then you may be able to extract very approximate values for where the user is looking within that space. If by chance there are any photos of your setup, they would be useful to see. Thanks.
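One way to implement that approximation is to assign each gaze sample to whichever known object makes the smallest angle with the gaze ray. This is a sketch under the assumption of well-separated objects; the object names and positions are hypothetical:

```python
import math

def closest_object(origin, direction, objects):
    """Return the name of the object whose direction from the eye makes
    the smallest angle with the gaze ray. Only reasonable when objects
    are well separated angularly (little overlap along Z)."""
    def angle(target):
        v = [t - o for t, o in zip(target, origin)]
        dot = sum(d * c for d, c in zip(direction, v))
        nd = math.sqrt(sum(d * d for d in direction))
        nv = math.sqrt(sum(c * c for c in v))
        return math.acos(max(-1.0, min(1.0, dot / (nd * nv))))
    return min(objects, key=lambda name: angle(objects[name]))

# Hypothetical workspace layout (mm, user coordinate system)
objects = {"cup": (-200.0, 0.0, 500.0), "book": (250.0, 0.0, 600.0)}
eye = (0.0, 0.0, 0.0)
gaze = (-0.35, 0.0, 0.94)   # gaze direction roughly toward the cup
print(closest_object(eye, gaze, objects))  # prints "cup"
```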