[Solved] Acquiring Current Timestamp
Tagged: accuracy, c++, eyex, timestamp, Visual Studio
This topic has 13 replies, 4 voices, and was last updated 6 years, 5 months ago by Grant [Tobii].
04/02/2015 at 11:19 #2469 · tobdev (Participant)
I am using the Tobii EyeX API to develop my own methods for filtering the gaze data.
I have built a function that calculates a new gaze point from an array of recent gaze points captured by the EyeX.
The function returns a TX_GAZEPOINTDATAEVENTPARAMS struct containing the gaze point that I created. How is the Timestamp acquired by the EyeX API?
I want to timestamp my new gaze point using the same format.
04/02/2015 at 14:55 #2470 · Jenny [Tobii] (Participant)
Hi tobdev,
The timestamp is in microseconds, relative to an arbitrary point in time.
You could use the timestamp of the most recent data point that you used to calculate the new data point, or do some sort of averaging over the timestamps of several points used to calculate the new point. What is reasonable depends on how your filter algorithm works.
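For illustration, a minimal sketch of that approach (not from the SDK samples; it assumes the X, Y and Timestamp fields of TX_GAZEPOINTDATAEVENTPARAMS from eyex/EyeX.h, with Timestamp in microseconds):

#include "eyex/EyeX.h"
#include <vector>

// Average a non-empty window of recent gaze points and stamp the result
// with the newest sample's timestamp, as suggested above.
TX_GAZEPOINTDATAEVENTPARAMS FilterGazePoints(
    const std::vector<TX_GAZEPOINTDATAEVENTPARAMS>& window)
{
    // Copying the newest sample keeps its data mode and its timestamp
    // (microseconds, counted from an arbitrary epoch).
    TX_GAZEPOINTDATAEVENTPARAMS filtered = window.back();
    double sumX = 0.0, sumY = 0.0;
    for (const auto& p : window) { sumX += p.X; sumY += p.Y; }
    filtered.X = sumX / window.size();
    filtered.Y = sumY / window.size();
    return filtered;
}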
05/02/2015 at 11:38 #2473 · tobdev (Participant)
Is there a function built into the EyeX API that returns the current timestamp that I can use?
06/02/2015 at 09:27 #2479 · Jenny [Tobii] (Participant)
Hi tobdev,
No, there is no such function in the EyeX API. The timestamps in the events are designed to be used for comparison between events. This is all that should be needed for a gaze interaction use case, which is the intended use of the EyeX API.
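For example, a comparison between two consecutive events could look like this (a sketch, assuming the Timestamp field of TX_GAZEPOINTDATAEVENTPARAMS from eyex/EyeX.h, in microseconds):

#include "eyex/EyeX.h"

// The epoch is arbitrary, so only differences between timestamps are
// meaningful; this returns the time elapsed between two events in
// milliseconds.
double ElapsedMs(const TX_GAZEPOINTDATAEVENTPARAMS& previous,
                 const TX_GAZEPOINTDATAEVENTPARAMS& current)
{
    return (current.Timestamp - previous.Timestamp) / 1000.0;
}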
07/06/2018 at 19:59 #8431 · vwgx3 (Participant)
Hello,
sorry to resurrect an older post, but I think my question fits best here, and I hope that is OK. I am still not quite sure whether I have fully understood how timestamp precision in Tobii EyeX gaze data should be interpreted:
1) Accuracy
According to the “Timing Guide for Tobii Eye Trackers and Eye Tracking Software”, the variance of the timestamp, which is generated in the eye tracker at the middle of the capturing period, is primarily linked to how well the experiment setup complies with the optimal conditions the eye tracker is designed for. Under the worst setup conditions, it may be up to +/-3 ms.
But that is an issue distinct from “latency”, which solely affects how soon the gaze data becomes available to the subscribing application once it has been recorded by the tracker. Therefore the timestamp contained in the data frame still describes the recording time accurately (within that range of at most +/-3 ms cited in the Timing Guide), unaffected by these latencies? And latency then primarily affects the availability of the recorded data?
Put differently: as soon as the client application obtains a gaze data frame, its bundled timestamp still allows one to determine the point in time of its recording, with the cited accuracy and relative to the starting point of the timestamp count (as arbitrarily initiated by Tobii software stack internals)?
2) Measurement unit
Although [ms] usually denotes “milliseconds”, I think that here, and in the context of the whitepaper “Timing Guide for Tobii Eye Trackers and Eye Tracking Software”, the timestamp unit actually is “microseconds”, as often indicated elsewhere, so that [ms] really means “micro”?
I am trying to get this right because I want to draw conclusions about subsequent data frames when frames/samples have been dropped, which depends on the timestamp bundled in each frame representing its true recording time accurately. If someone could kindly confirm or correct my assumptions, that would be great!
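For concreteness, this is roughly the check I have in mind; a minimal sketch in which nominalIntervalUs is my own assumption (the frame rate is not fixed, so it would have to be measured for the actual setup) and the timestamps are assumed to be in microseconds:

// Returns true when the gap between two consecutive timestamps is large
// enough to suggest that one or more samples were dropped in between.
bool FramesLikelyDropped(double prevTimestampUs, double currTimestampUs,
                         double nominalIntervalUs)
{
    const double gapUs = currTimestampUs - prevTimestampUs;
    return gapUs > 1.5 * nominalIntervalUs;
}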
07/06/2018 at 22:34 #8432 · Grant [Tobii] (Keymaster)
Hi @vwgx3, thanks for your query. The whitepaper you refer to, “Timing Guide for Tobii Eye Trackers and Eye Tracking Software”, was written for the Tobii Pro range of eye trackers, which operate under rather different firmware and hardware configurations, distinct from the Tobii Tech range of eye trackers (EyeX, 4C). The latter may have a variable frame rate response, so the values contained within the whitepaper are not necessarily applicable.
Indeed, the Tech range of trackers is primarily designed for interaction and gaming purposes, not scientific research or analysis; in fact, using the trackers for that purpose requires the purchase of a special licence. Accordingly, no such whitepaper on timing is produced for this range of eye trackers.
The best thing would be if you could kindly describe your project setup, including all Tobii hardware and the intended software stack, and I will be happy to try to find the most appropriate solution for your needs. Thanks.
08/06/2018 at 10:48 #8433 · vwgx3 (Participant)
Hi Grant,
thank you for your quick reply and offer to help.
I use a Tobii EyeX as part of the equipment for my final degree project, and it actually serves my requirements quite well:
I need to acquire raw gaze data that I can scrutinize in real or near-real time for specific reactions of the user.
The missing piece of data is whether the user changes gaze velocity, and this is where the time information is required.
Jenny already provided an excellent proposal above for handling the timestamp, but I just wanted to double-check whether the timestamp-to-data mapping truly is constant over time.
In other words: does the timestamp always refer to the same point in time within the device-internal data acquisition procedure, and would it therefore be a feasible basis for measuring gaze velocity?
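For reference, this is the kind of computation I intend; a sketch assuming X/Y in pixels and Timestamp in microseconds (field names as in eyex/EyeX.h):

#include "eyex/EyeX.h"
#include <cmath>

// Derive gaze velocity (pixels per second) from the previous and current
// frame; only the previous frame needs to be kept around.
double GazeVelocityPxPerSec(const TX_GAZEPOINTDATAEVENTPARAMS& prev,
                            const TX_GAZEPOINTDATAEVENTPARAMS& curr)
{
    const double dtSec = (curr.Timestamp - prev.Timestamp) / 1e6;
    if (dtSec <= 0.0) return 0.0; // guard against duplicate timestamps
    const double dx = curr.X - prev.X;
    const double dy = curr.Y - prev.Y;
    return std::sqrt(dx * dx + dy * dy) / dtSec;
}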
Since variances are technically unavoidable, it would also be interesting to learn how large they can typically get for an EyeX (i.e. just the variance of the data acquisition/frame capturing, leaving aside all the latencies arising in the subsequent processing chain).
That would be a big help.
Thank you.

08/06/2018 at 13:38 #8434 · Grant [Tobii] (Keymaster)
Hi @vwgx3, I will endeavour to find the relevant information for your query, but in the meantime I should point out that the long-term storage of gaze data for analytical purposes is expressly prohibited with the EyeX and the associated SDK without a special licence, as stated @
Accordingly, it would be great if you could confirm whether you have already purchased this licence, along with the specifics of your project. In any event, I will still try to find the answers to your questions if possible, although these metrics for the EyeX may not be publicly available. Thanks for your patience.
08/06/2018 at 14:54 #8435 · vwgx3 (Participant)
Hi Grant,
thank you again.
I read the licensing terms and think I do comply with them, as I use the eye tracker to implement a complementary non-tactile interaction component within a man-machine control UI.
Data is not stored or analysed beyond the real-/near-time scanning used to generate triggers that implement user decisions in a machine control UI.
If the time information can be used as planned/anticipated/understood so far, the logic would require storing just the previously recorded frame for the span of computing a difference, while all older frames are disposed of immediately. Anything beyond that would be too slow.
The user’s behaviour itself is not the subject of my assignment.
I hope that this is still compliant?

09/06/2018 at 15:10 #8436 · Grant [Tobii] (Keymaster)
Hi @vwgx3, initially it seems your project specs are OK, but I will need to confirm that internally before proceeding. Thanks for your patience whilst I try to get you the timing information.
11/06/2018 at 10:03 #8439 · vwgx3 (Participant)
Hi Grant,
that would be gorgeous, thank you!
Best

11/06/2018 at 18:28 #8440 · Grant [Tobii] (Keymaster)
Hi @vwgx3, I can confirm that the timestamp is accurate to within a few ms for this purpose.
It corresponds to the point in time when the image has been read from the image sensor. Whilst there is some variability according to camera-specific settings, the specifics of this are not in the public domain for the Tobii Tech range of eye trackers. We do not output corrections for this potential variability for Tech trackers; however, the timestamp refers to essentially the same point in time, with up to a few ms of variability.
Tobii Pro trackers are designed and documented for use cases where lower variability is required and extensively documented timing values are available.
So ultimately, yes, you can depend on the timestamp you receive being accurate to the moment at which the gaze sample was taken, especially for interaction purposes.
Regarding licencing, I am waiting to hear back, but hopefully this information is of some use to you.
12/06/2018 at 12:32 #8448 · vwgx3 (Participant)
Hi Grant,
thank you very much for sharing this information about the timestamp question up front!
That’s good news already.
If the other aspect also works out that well, I’d have all I need to finish my work 😉
Thank you again,
best regards

13/06/2018 at 18:03 #8453 · Grant [Tobii] (Keymaster)
Great, happy to help! Let us know if we can be of any further assistance 🙂