- 07/09/2014 at 21:33 #1600
We are experimenting with using the EyeX for research purposes. Yes, other trackers would be better suited to the task, but the EyeX's low cost would make it perfect for acquiring many units and having a common platform among our research partners. We do have glasses-based trackers as well, but for various reasons they are used in only a few studies.
In any case, here is what we would like ultimately to achieve:
A person is in front of a table, but there is no monitor in front of him. Taking a hint from the MinimalCalibration app in the Gaze SDK, could we theoretically use a thick, rigid piece of cardboard with 9 points, positioned firmly where the monitor is supposed to be (aligned with the EyeX's markers), and manually, in a console application, confirm that the person is looking at calibration point x?
Then, after the calibration, the cardboard could be removed and the gaze tracked (at least on the plane where the monitor was supposed to be).
Our accuracy requirements are very low. We are more interested in detecting gaze within fairly broad regions (in tests where the monitor was moved slightly during tracking, i.e., a less-than-perfect tracker installation, accuracy remained well within our margins).
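To make the "broad regions" idea concrete, here is a minimal, SDK-independent sketch of mapping a gaze point (normalized to the calibration plane, 0..1 in both axes) onto a coarse grid of regions. The grid size and the row-major indexing are arbitrary choices for illustration, not anything prescribed by the SDK:

```c
/* Map a normalized gaze point (0..1 on both axes of the calibration
 * plane) onto a coarse grid of regions, e.g. 3 columns x 3 rows.
 * Returns the region index in row-major order, or -1 if the point
 * falls outside the plane entirely. */
static int gaze_region(double x, double y, int cols, int rows)
{
    if (x < 0.0 || x > 1.0 || y < 0.0 || y > 1.0)
        return -1;
    int col = (int)(x * cols);
    int row = (int)(y * rows);
    if (col == cols) col = cols - 1;  /* clamp the x == 1.0 edge case */
    if (row == rows) row = rows - 1;  /* clamp the y == 1.0 edge case */
    return row * cols + col;
}
```

With broad regions like these, a few degrees of tracker error moves the gaze point well within the same cell most of the time, which is why a slightly misaligned installation can stay within margin.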
Is the above approach possible?
- 09/09/2014 at 09:18 #1622
Yes, that approach should be possible. I guess the hard part is synchronizing the calibration process, so that the user looks at the correct place at the correct time.
Once you have succeeded with the calibration and removed the calibration cardboard, you could, in addition to the 2D gaze point on the calibration plane, also use the 3D gaze vector (from the 3D eye position to the 3D gaze position) to track whether the user is looking at objects in front of or behind the calibration plane. It requires keeping good track of your coordinate systems, but I think it can be worth it.
- 09/09/2014 at 09:41 #1623
I was thinking of solving the synchronization problem by adding an extra step. Instead of the program pausing for only two seconds or so (XSLEEP(2000);), it would pause and ask for input before proceeding to the next point.
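A minimal sketch of that pacing loop, assuming a 3x3 grid of targets in normalized cardboard coordinates; `add_calibration_point()` is a hypothetical stand-in for the SDK's add-point call, not a real function name:

```c
#include <stdio.h>

/* The 9 calibration targets as normalized coordinates on the cardboard
 * (0,0 = top-left, 1,1 = bottom-right), laid out as a 3x3 grid. */
static void calibration_target(int i, double *x, double *y)
{
    *x = 0.1 + 0.4 * (i % 3);   /* 0.1, 0.5, 0.9 */
    *y = 0.1 + 0.4 * (i / 3);
}

/* Pacing loop: instead of XSLEEP(2000), block on stdin so the
 * operator confirms the subject is fixating the target before the
 * point is sampled. */
static void run_calibration(void)
{
    for (int i = 0; i < 9; ++i) {
        double x, y;
        calibration_target(i, &x, &y);
        printf("Point %d at (%.1f, %.1f) -- press Enter when the subject fixates it\n",
               i + 1, x, y);
        int c;
        do { c = getchar(); } while (c != '\n' && c != EOF);
        /* add_calibration_point(x, y);  <- hypothetical SDK call goes here */
    }
}
```

Blocking on `getchar()` rather than a timer means each point takes exactly as long as the operator needs, which suits a manual, experimenter-driven protocol.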
I read elsewhere in the forum that it is possible to mix the EyeX SDK and the Gaze SDK for .NET. I assume it is also possible for C/C++?
- 09/09/2014 at 11:42 #1624
Asking for input before each calibration point sounds like a good idea.
Regarding mixing the EyeX SDK and the Gaze SDK: it is technically possible, but might be a bit complicated. Since you require custom calibration, do not depend on on-screen interaction, and have an interest in porting your software to Linux (if/when you have an eye tracker that runs on Linux), I would recommend just using the Gaze SDK.
Read more about:
Differences between Tobii Gaze SDK and Tobii EyeX SDK
- 09/09/2014 at 12:04 #1625
So, just to clarify some points that I am confused about:
If I am using the EyeX tracker and want to use exclusively the Gaze SDK for my application (in Windows):
a) Does the EyeX Engine have to be running for my Gaze SDK application to connect to the tracker, or not (the drivers are of course installed)?
b) Will using the Gaze SDK API to initialize the EyeX tracker work? (the tobiigaze vs. the eyex namespace)
- 09/09/2014 at 14:09 #1626
You only need to run the EyeX Engine once, to install the drivers and configure the display settings. After that, you can use tobiigaze_get_connected_eye_tracker to connect to the eye tracker without the EyeX Engine running. Please refer to the C/C++ samples in the Gaze SDK package.