- 14/10/2014 at 09:53 #1838
In the Gaze SDK, what is the reference point of the timestamp provided via tobiigaze_gaze_data?
How can I set it to a custom timestamp? I noticed that there are relevant function prototypes for callbacks to retrieve the timestamp and timesync, but there is no information at all about this.
A short example would be very much appreciated.
Thank you!
- 15/10/2014 at 14:17 #1854, Jenny [Tobii] (Participant)
The timestamp reference point can be considered an arbitrary point in time. There is no way to set it to a custom timestamp. What you would need to do is to create some small utility function for handling this yourself.
For example: save the first received timestamp and, at the same moment, save a reference timestamp based on your preferred time-measuring mechanism. For every new gaze data sample received, you can then subtract the first gaze data timestamp from the current gaze data timestamp and know how many microseconds have passed since your reference timestamp.
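The anchoring scheme described above can be sketched as a small C utility. The struct and function names are illustrative and not part of the Gaze SDK; the caller is assumed to supply a host-clock reading in microseconds from whatever clock source it prefers.

```c
#include <stdint.h>

/* Map eye-tracker timestamps onto the host clock: remember the first
 * device timestamp together with a host-clock reading taken at the same
 * moment, then express every later sample relative to that anchor. */
typedef struct {
    int64_t first_device_us; /* first device timestamp seen (microseconds) */
    int64_t host_anchor_us;  /* host clock captured at that moment */
    int     initialized;
} timestamp_mapper;

static void mapper_init(timestamp_mapper *m)
{
    m->initialized = 0;
}

/* Returns the sample's time expressed on the host clock, in microseconds.
 * host_now_us is only consulted for the very first sample. */
static int64_t mapper_to_host_us(timestamp_mapper *m,
                                 int64_t device_us, int64_t host_now_us)
{
    if (!m->initialized) {
        m->first_device_us = device_us;
        m->host_anchor_us  = host_now_us;
        m->initialized     = 1;
    }
    return m->host_anchor_us + (device_us - m->first_device_us);
}
```

Note that this inherits whatever latency exists between the device stamping the first sample and the host receiving it, a limitation discussed later in the thread.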
The timestamp and timesync function prototypes are there for specific experimental purposes, and there is no plan to document or exemplify their usage.
- 17/10/2014 at 10:52 #1860
Thank you for the answer. I have implemented this, but there is unknown latency, and it is unclear whether gaze points are processed in FIFO order, whether processing happens in parallel, whether samples are dropped to achieve a desired frequency, and so on.
As I understand it, the timestamp of a gaze point is taken when the eye image is captured; the image is then processed for an undefined amount of time before the point becomes available through the SDK. The moment the end-user (my) application receives the gaze point via the relevant callback is also undefined in a way, because it is not clear when and under what circumstances the callback event is fired.
I measured the time difference between consecutive gaze samples using two different timestamps:
(a) the one provided by the eye-tracker
(b) the local timestamp of the host at the moment the callback function was executed
The results are inconsistent. The time difference based on the eye-tracker timestamps is around 15 ms, with a standard deviation of up to 2 ms, while for the same gaze samples the callback is executed at arbitrary intervals: sometimes the callback for the next sample fires after 4 ms, other times after 10 ms or even 40 ms. What’s more, the reported timestamps show a very regular outlier: every 10 gaze samples, a sample appears to be dropped from the pipeline, since the time difference to the next sample is around 30 ms (twice as much as the others). Look at my list at the end of the post, on lines 1, 10, 19, 28, 37 and 46. I took these measurements on the EyeX. I can try it on an X120, but unfortunately not anytime soon.
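The measurement above amounts to recording, for each incoming sample, the delta to the previous sample on both clocks. A minimal sketch in C, with illustrative names (the actual Gaze SDK callback signature differs, and the host clock source is left to the caller):

```c
#include <stdint.h>

/* Deltas between two consecutive gaze samples on both clocks. */
typedef struct {
    int64_t host_delta_us;   /* time between callback invocations (host clock) */
    int64_t device_delta_us; /* time between samples (eye-tracker clock) */
} sample_deltas;

typedef struct {
    int64_t prev_host_us;
    int64_t prev_device_us;
    int     have_prev;
} delta_tracker;

static void delta_tracker_init(delta_tracker *t)
{
    t->have_prev = 0;
}

/* Call once per gaze sample, from inside the gaze-data callback.
 * Returns 1 and fills *out when a previous sample exists; returns 0
 * for the very first sample. */
static int record_sample(delta_tracker *t, int64_t device_us,
                         int64_t host_us, sample_deltas *out)
{
    int have = t->have_prev;
    if (have) {
        out->host_delta_us   = host_us - t->prev_host_us;
        out->device_delta_us = device_us - t->prev_device_us;
    }
    t->prev_host_us   = host_us;
    t->prev_device_us = device_us;
    t->have_prev      = 1;
    return have;
}
```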
So my intuition is that even if I record the host time at the moment I receive the first gaze sample, it cannot reliably be assumed to be the actual time at which that sample was timestamped.
I still think the SDK should allow setting a user-defined timestamp via a very fast, low-latency function before tracking starts, with all subsequent gaze sample timestamps adjusted to account for latency. Of course clocks drift over time, but at least the timestamps would be much more accurate at the beginning.
Of course I may have got it all wrong, so please let me know how I could do this better. Thanks a lot.
Here is one run of my test, described above (time differences are shown in microseconds; my host’s clock measurements have a resolution of 1 ms):
1) [samples: 0,1] [status: 1] host: 25000 eyetracker: 31599
2) [samples: 1,2] [status: 1] host: 19000 eyetracker: 14493
3) [samples: 2,3] [status: 1] host: 15000 eyetracker: 15104
4) [samples: 3,4] [status: 1] host: 12000 eyetracker: 14320
5) [samples: 4,5] [status: 1] host: 16000 eyetracker: 15590
6) [samples: 5,6] [status: 1] host: 15000 eyetracker: 15056
7) [samples: 6,7] [status: 1] host: 15000 eyetracker: 14337
8) [samples: 7,8] [status: 1] host: 12000 eyetracker: 15590
9) [samples: 8,9] [status: 1] host: 28000 eyetracker: 15002
10) [samples: 9,10] [status: 1] host: 18000 eyetracker: 31070
11) [samples: 10,11] [status: 6] host: 19000 eyetracker: 13925
12) [samples: 11,12] [status: 6] host: 13000 eyetracker: 14408
13) [samples: 12,13] [status: 6] host: 12000 eyetracker: 15604
14) [samples: 13,14] [status: 6] host: 26000 eyetracker: 14983
15) [samples: 14,15] [status: 6] host: 4000 eyetracker: 14411
16) [samples: 15,16] [status: 6] host: 15000 eyetracker: 15604
17) [samples: 16,17] [status: 6] host: 19000 eyetracker: 14988
18) [samples: 17,18] [status: 6] host: 11000 eyetracker: 14436
19) [samples: 18,19] [status: 1] host: 39000 eyetracker: 31636
20) [samples: 19,20] [status: 1] host: 13000 eyetracker: 13933
21) [samples: 20,21] [status: 1] host: 8000 eyetracker: 14997
22) [samples: 21,22] [status: 1] host: 17000 eyetracker: 14994
23) [samples: 22,23] [status: 1] host: 13000 eyetracker: 15013
24) [samples: 23,24] [status: 1] host: 38000 eyetracker: 14979
25) [samples: 24,25] [status: 1] host: 2000 eyetracker: 15003
26) [samples: 25,26] [status: 1] host: 61000 eyetracker: 15093
27) [samples: 26,27] [status: 1] host: 4000 eyetracker: 15058
28) [samples: 27,28] [status: 1] host: 2000 eyetracker: 30949
29) [samples: 28,29] [status: 1] host: 1000 eyetracker: 13896
30) [samples: 29,30] [status: 1] host: 13000 eyetracker: 14997
31) [samples: 30,31] [status: 1] host: 30000 eyetracker: 15019
32) [samples: 31,32] [status: 1] host: 2000 eyetracker: 14974
33) [samples: 32,33] [status: 1] host: 12000 eyetracker: 15048
34) [samples: 33,34] [status: 6] host: 15000 eyetracker: 14966
35) [samples: 34,35] [status: 1] host: 18000 eyetracker: 14988
36) [samples: 35,36] [status: 1] host: 12000 eyetracker: 15016
37) [samples: 36,37] [status: 1] host: 32000 eyetracker: 31072
38) [samples: 37,38] [status: 1] host: 13000 eyetracker: 13942
39) [samples: 38,39] [status: 1] host: 16000 eyetracker: 14970
40) [samples: 39,40] [status: 1] host: 15000 eyetracker: 14997
41) [samples: 40,41] [status: 1] host: 14000 eyetracker: 15035
42) [samples: 41,42] [status: 1] host: 38000 eyetracker: 14965
43) [samples: 42,43] [status: 1] host: 1000 eyetracker: 15008
44) [samples: 43,44] [status: 1] host: 6000 eyetracker: 15000
45) [samples: 44,45] [status: 1] host: 17000 eyetracker: 15002
46) [samples: 45,46] [status: 1] host: 37000 eyetracker: 31087
47) [samples: 46,47] [status: 1] host: 8000 eyetracker: 13910
48) [samples: 47,48] [status: 1] host: 34000 eyetracker: 15065
49) [samples: 48,49] [status: 1] host: 2000 eyetracker: 14920
- 20/10/2014 at 08:13 #1865, Robert [Tobii] (Participant)
Your observations are correct. But as Jenny said, the Gaze SDK time sync functionality and the EyeX Controller firmware are experimental and not adapted to the kinds of applications that require careful time sync.
If you want to have full control of the clock synchronization, you should take a look at Tobii’s product lines for research and analysis. You already mentioned that you have access to an X120. If you use that together with the Tobii Analytics SDK 3.0, you should have the control you need.
The Tobii EyeX product lines are primarily meant for games and consumer applications. We have not yet found any use case where this kind of careful time sync is required. But if we find that many of these applications require time sync, then we’ll have to add it.
- 20/10/2014 at 09:31 #1869
Hi Robert and Jenny,
Thank you for your replies. I have been using Tobii products and the other SDKs for quite a few years. My understanding was that the Tobii Gaze SDK is the lowest-level SDK for controlling the devices, so I decided to port my pipeline and do my own time sync, which would work across different Tobii eye-trackers.
I am merely asking for a function to obtain important data from the devices via the low-level SDK, so that I can develop my own time sync (however successful that may be), and I think this is a reasonable thing to expect from a low-level API. It is my understanding that the Analytics SDK does not work with the EyeX Controller, so effectively I am stuck with a device that I have no means to sync better.
I have pondered this for a while and I can think of no technical reason why the Tobii Gaze SDK could not have a couple of functions to set or get timestamps on the eye-tracker. For instance, a possible solution might involve setting a timestamp from the user application (and being able to read it back from the eye-tracker), as well as annotating each gaze data sample with two timestamps: one taken when the sample enters the processing pipeline and one when it exits. In my opinion this would be better than having no such functionality at all.
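If the SDK ever exposed a way to read the device clock on demand, a standard round-trip (Cristian-style) offset estimate could be layered on top of it. The sketch below shows only the arithmetic; no such call exists in the Gaze SDK today, and the function name is made up.

```c
#include <stdint.h>

/* Estimate the device-clock-minus-host-clock offset from one round trip:
 * host_send_us  - host clock when the "read device clock" request was sent
 * device_us     - timestamp the device reported
 * host_recv_us  - host clock when the reply arrived
 * Assumes the device stamped its clock halfway through the round trip;
 * the error is bounded by half the round-trip time. */
static int64_t estimate_clock_offset_us(int64_t host_send_us,
                                        int64_t device_us,
                                        int64_t host_recv_us)
{
    return device_us - (host_send_us + host_recv_us) / 2;
}
```

Averaging several such estimates, and discarding those with long round trips, is the usual way to tighten the bound.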
I hope that you will at least consider adding some kind of functionality to the Gaze SDK.