Fixation jitter

    #8816
    vwgx3
    Participant

    Hello,
    my knowledge of the physiological background of gaze modes is still limited, so I was surprised to discover that fixations and smooth pursuit eye movements still exhibit noticeable "jitter" in the measurements – or, put differently, micro-oscillations. In my observations, the jitter amplitudes during fixations often do not differ from those of well-executed smooth pursuit eye movements…
    I am using a Tobii EyeX, so I am aware of its limitations in terms of accuracy and constancy of the sampling rate. Now I wonder: do these oscillations result from device-internal methods of quantizing the user's gaze points, or do the measurements truthfully reflect physiological necessity? The non-uniformity of the sampling rate is something I thought I had taken care of by normalizing position deltas with the corresponding timestamp differences – or so I thought…
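
    For reference, a minimal sketch (in Python) of the normalization I mean – the (t, x, y) sample layout is just my own convention here, not the SDK's data structure:

    ```python
    def velocities(samples):
        """Yield gaze speed between consecutive samples, dividing by the
        actual timestamp delta instead of assuming a fixed sample period."""
        for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
            dt = t1 - t0
            if dt <= 0:
                continue  # guard against duplicate or out-of-order timestamps
            yield ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
    ```
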
    Can some of you help?
    Best

    #8828
    Grant [Tobii]
    Keymaster

    Hi @vwgx3, thank you for your query. It is important to be careful with the various terminologies in use here, to avoid confusion.

    It is quite true that raw (or even lightly filtered) gaze data as received from the eye tracker, whether it be the EyeX or a more advanced model, will always exhibit inherent jitter in position over time. This is in fact a natural consequence of how the human eye works, and so is not a technological deficiency but rather an unavoidable biological characteristic.

    A fixation, however (as opposed to simple gaze location data), is quite different: it is defined as a location of sustained attention over time. The calculation of a fixation admits numerous interpretations and methods, but suffice it to say that a fixation is computed from the raw gaze data using an algorithm (a fixation filter).

    Further details on the fixation filter Tobii uses in some of our software can be found at https://www.tobiipro.com/siteassets/tobii-pro/learn-and-support/analyze/how-do-we-classify-eye-movements/tobii-pro-i-vt-fixation-filter.pdf

    Along with that, further details on fixation concepts can be found at https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/types-of-eye-movements/
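
    Purely for illustration, here is a toy velocity-threshold (I-VT-style) classification in Python. This is not the Tobii Pro I-VT filter itself, and the 30 deg/s threshold is only a commonly cited placeholder – see the PDF above for the real details:

    ```python
    VELOCITY_THRESHOLD = 30.0  # degrees of visual angle per second (placeholder)

    def classify(samples):
        """samples: list of (timestamp_s, visual_angle_deg) pairs.
        Labels each inter-sample step; consecutive 'fixation' labels
        would then be merged into fixation events with a duration."""
        labels = []
        for (t0, a0), (t1, a1) in zip(samples, samples[1:]):
            v = abs(a1 - a0) / (t1 - t0)
            labels.append("fixation" if v < VELOCITY_THRESHOLD else "saccade")
        return labels
    ```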

    The resultant fixation is therefore not subject to jitter in the usual sense but is well defined within a window of space and time, so I am somewhat confused about what you mean by jitter in your fixations.

    It would therefore be helpful if you could kindly describe the calculations you are using to determine fixations from the data received from your EyeX.

    In any event, I hope the information above has proven useful in understanding the distinction between simple gaze data and fixations. Please let us know how we can be of further assistance; if you could also include details of your current project, that would be appreciated.

    #8830
    vwgx3
    Participant

    Hello Grant,

    thank you very much for your detailed answer, and the additional interesting background information.

    My concept of a fixation was (and is): a user intentionally "staring" at a specific point, making an effort to deviate from this point as little as possible.

    The resulting position coordinates then form a curve I described as "jittery", because it is not a perfectly flat line – which it cannot be, due to the dynamic nature of the eye's motor system (microsaccades).

    The material you pointed me to describes this very well and so far confirms my understanding.

    My measurements yield a standard deviation of e.g. 6, or 1.2% of the amplitude relative to the absolute coordinate origin, and that is a quality one can work with perfectly well – especially if the position data are referenced via a representative characteristic value such as the mean.

    That’s all fine.

    What is problematic for me is that the jitter amplitudes during fixations are almost indistinguishable from those measured for smooth pursuit eye movements at low speeds – which is not what I had expected and which I have to investigate further.

    Secondly, the issue that led me to scrutinize the constancy of the position coordinates during fixations in the first place was something else, which has now turned out to be an issue with timestamps:

    If I calculate the derivatives of the positions (velocities, accelerations), hefty peaks show up: randomly, and not rarely.
    I found this is linked to the non-equidistant timestamps (caused by the non-uniform sampling rate, which as I understand is a design trait of the EyeX, and therefore expected).

    Those peaks are rooted in very low timestamp deltas (very much lower than the neighbouring deltas), which, when calculating derivatives, produce peaks with amplitudes tens of thousands of times (!) the base value.
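
    To illustrate with toy numbers (a 1 px jitter step over a normal ~16 ms delta versus over an anomalously small 0.05 ms delta):

    ```python
    v_normal = 1.0 / 0.016    # ~62 px/s   -> harmless noise floor
    v_spike  = 1.0 / 0.00005  # 20000 px/s -> a derivative peak several orders
                              #               of magnitude above the base value
    ```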

    The problem for me is that this happens randomly, and the peaks are not perfectly discrete (i.e. single outlier-like values, which would make them easy to remove with appropriate filters); rather, they build up across several samples, just within a very brief span of time.

    Since they occur during both fixations and smooth pursuit eye movements, I effectively do not have a truly constant or stationary signal, which is therefore not usable for what I have to accomplish: reliably using constant/static gaze states as event triggers.

    I now wonder whether those extremely low timestamp deltas are due to the tracker resetting something, and thus part of its operating behaviour?
    Or might the device simply have hardware issues?

    My understanding so far was that the sampling rate adapts to the current gaze input: fast gaze-state changes may cause the tracker's firmware to drop the frame rate so the tracker can keep up with the input (or vice versa, depending on other goals).

    But that is not in line with what I observe, as these extremely short timestamp deltas show up during smooth pursuit eye movements and even fixation phases, and not during saccades…

    Best

    #8832
    vwgx3
    Participant

    I just forgot to ask about a third option when speculating about the reasons for my observations: maybe the tracker is simply not made to generate data output suitable for velocity measurements (in other words, data at a constant sampling rate), but instead focuses on providing fixation data with good precision, even at high gaze velocities?

    #8836
    Grant [Tobii]
    Keymaster

    Hi @vwgx3, you are quite correct that the concept of a fixation is the maintaining of the visual gaze on a single location; however, there is not necessarily an intention not to deviate… it should be seen as something natural that occurs when viewing certain stimuli. For example, an interesting logo that catches your eye for a few seconds would likely be registered as a fixation.

    However, the actual *calculation* of what constitutes a fixation based upon raw gaze data is non-trivial and something that should be considered before moving forward. Put in simple terms: the input is the inherently jittery raw gaze data, which is fed into a fixation filter algorithm that ideally produces a series of stable fixation points, each with a corresponding duration.
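
    As a sketch of that pipeline, here is a simple dispersion-based variant (often called I-DT) in Python – the thresholds are placeholders, not values from our filters, and the final window is left unflushed for brevity:

    ```python
    def idt_fixations(samples, max_dispersion=30.0, min_duration=0.100):
        """samples: (timestamp_s, x, y) tuples. Returns fixation events as
        (centroid_x, centroid_y, t_start, t_end) - jitter averaged away."""
        fixations, window = [], []
        for sample in samples:
            window.append(sample)
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                settled = window[:-1]  # samples before the breakout point
                if settled and settled[-1][0] - settled[0][0] >= min_duration:
                    cx = sum(p[1] for p in settled) / len(settled)
                    cy = sum(p[2] for p in settled) / len(settled)
                    fixations.append((cx, cy, settled[0][0], settled[-1][0]))
                window = [sample]  # start a new window at the breakout sample
        return fixations
    ```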

    Accordingly, the specific algorithm you use matters, so it would be most useful if you could explain how you are calculating fixations, along with which SDK/API you are currently using with the EyeX.

    For your interest, there are a number of free fixation filters which you can perhaps translate for your purposes:
    https://se.mathworks.com/matlabcentral/fileexchange/56236-djangraw-gazevistoolbox
    https://github.com/davebraze/FDBeye/wiki/Researcher-Contributed-Eye-Tracking-Tools
    https://www.tobiipro.com/about/partners-resellers/app-market/fixation-detector/

    I am also concerned about your taking the raw positions over time and differentiating… I assume to determine saccadic velocity? This is not a valid technique, I am afraid; one needs to take the distance between two properly established fixation points and divide by the time difference to obtain saccadic velocity.
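
    In rough code terms (a Python sketch; the fixation end and start points are assumed to come from a fixation filter, not from raw samples):

    ```python
    from math import hypot

    def saccade_velocity(fix_a_end, fix_b_start):
        """Each argument: (timestamp_s, x, y) - the end of one established
        fixation and the start of the next. Velocity = amplitude / time."""
        (t0, x0, y0), (t1, x1, y1) = fix_a_end, fix_b_start
        return hypot(x1 - x0, y1 - y0) / (t1 - t0)
    ```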

    You are quite right that the variable sample rate will cause issues should you need exact velocities, and indeed the EyeX is designed not for detailed analysis but for simple interaction purposes, so that is something else to bear in mind. I found an article published by a third-party research group concerning the EyeX that discusses the consequences of this variation, which may be of use to you: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5429393/

    In any event, please be aware that any analysis of data undertaken using the EyeX and its associated SDKs (Unity, Core SDK, etc.) is expressly prohibited without the purchase of a special analytical licence. More information regarding this can be found at https://analyticaluse.tobii.com/

    If you could kindly describe your intentions with the EyeX that would be helpful.

    #8838
    vwgx3
    Participant

    Hello Grant,

    thank you again for your detailed information and patience.

    I only mentioned "intention" in an attempt to describe that I tried to optimize the execution of the fixation as well as a viewer can possibly perform and contribute.

    Actually, all I do is very simple: fixation in exactly the sense it is used in gaze interaction for activating specific modes, and as a differentiation against smooth pursuit eye movements and saccades.
    So no elaborate algorithms beyond scanning the data stream for maximum position deltas, both as absolute values and in relation to the timestamp deltas – roughly as in the sketch below.
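
    In code terms it is roughly this (simplified Python; the threshold values are just ones I tune by hand):

    ```python
    def fixation_trigger(prev, cur, max_delta_px=8.0, max_speed_px_s=50.0):
        """prev/cur: (timestamp_s, x, y). Trigger when the position delta
        is tiny both absolutely and relative to the timestamp delta."""
        (t0, x0, y0), (t1, x1, y1) = prev, cur
        dt = t1 - t0
        d = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        return dt > 0 and d <= max_delta_px and d / dt <= max_speed_px_s
    ```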

    What I need is:
    1) a signal that is triggered when the user "fixates" a spot (effectively: does not move the gaze), which I want to detect via an assumed *very* low average gaze velocity coinciding with a *very* limited variation of the current gaze coordinates;
    2) a way to determine whether the user's gaze moves "at about the same speed" as the visual stimulus.

    Unfortunately, both fail:
    1) At low SPEM velocities, the measured velocities are "not so different" from what I measure for fixations, so state transitions are hard to detect.
    2) The discrete velocities of both the fixation and the SPEM data exhibit random, more or less frequent peaks with almost absurd amplitudes – hundreds or even thousands of percent above the average of the measured signal.

    As the sampling rate is non-uniform, velocity calculations are obviously more of a guessing game; I am well aware of this.
    But if the signal exhibited only the degree of noise I described as "jitter" at the beginning (which is surely the combined result of the eye's physiological micro-oscillations and the non-uniform time deltas), that guess would fit my needs perfectly.

    Unfortunately, this tolerable noise is superimposed with those peaks resulting from extremely low timestamp deltas every now and then: very high amplitudes, yet not easily removed by the typical filters used for eliminating outliers.
    I have tried every commonly used filter type, and many permutations of coefficient settings, but the result is not good enough to generate reliable triggers.
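
    For reference, the simplest representative of what I tried (a sliding median in Python) – which fails here precisely because the peaks span several consecutive samples:

    ```python
    def median_filter(values, k=5):
        """Sliding-window median (k odd). Removes single-sample outliers,
        but peaks spanning several consecutive samples survive it."""
        half = k // 2
        out = []
        for i in range(len(values)):
            window = sorted(values[max(0, i - half):i + half + 1])
            out.append(window[len(window) // 2])
        return out
    ```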

    What I now wonder, and what currently seems to be the crucial point (according to my present understanding), is:
    what causes the eye tracker to raise the sampling frequency so substantially (that seems to me to be the explanation for the brief timestamp deltas) when the gaze situation is as stationary as it practically and physiologically can get?

    – Is it caused by device-internal data processing requirements (e.g. cache lines that need to be flushed, or something similar)?
    – Or might it be an electronic issue?
    – Or is there something I simply did not realise or recognize that would have to be taken into account?

    As for the licence: I am aware of Tobii's licensing terms and believe I comply with them, as I do not store the data, not even in temporary arrays.
    All the data is scanned on the fly.
    Just scanning the data stream for triggers…

    If no one else has had similar experiences, I will have to consider that this issue is simply an irregularity that cannot be tackled, and that it is ultimately not worth investigating more deeply, as it does not reflect any fundamental concept one would have to understand in order to cure the phenomenon.

    But maybe…

    Best regards

    #8844
    Grant [Tobii]
    Keymaster

    Hi @vwgx3, thanks for the additional information regarding your project intentions. Indeed, I would agree you are quite within the licence agreement so no worries on that point 🙂

    So, my overall sense with respect to your ongoing issue is that the method by which you are determining a fixation should be looked at again.

    You had not mentioned which SDK you are using, but assuming you are working with the Tobii Core SDK Interaction API, it may be better to use the inbuilt fixation stream… had you already checked this out?

    To make things easier, here is a link to the sample applications for the Core SDK: https://github.com/Tobii/CoreSDK/tree/master/samples

    If you go to 'streams' and then 'Interaction_Streams_101', you will find a simple interface to the fixation data the Core SDK can provide… hopefully it should suffice for your needs. Please let me know your thoughts.

    Whilst the delta-position over delta-time route may seem reasonable, in combination with the variable capture rate of the EyeX it is, I am afraid, not a sufficiently reliable means of determining a fixation, as you have already discovered… hence I would strongly recommend using one of the aforementioned fixation filter algorithms should the Core SDK fixation stream not serve.

    With regard to the EyeX's variable capture rate itself, this is in fact due to the hardware and firmware used in producing this consumer-level eye tracker. For studies or projects that require a more constant capture rate, we have the Tobii Pro range of trackers, which are consequently rather more expensive due to the increased sophistication of their internal components.

    Hopefully we can get your project up and running soon – thanks for your patience.

    #8850
    vwgx3
    Participant

    Hi Grant,

    thank you for your answer and assessment.
    It is always good to learn, from the manufacturer's perspective, which specification constraints have to be considered.
    As you rightly guessed, I use the Core SDK, and have compared the unfiltered and lightly filtered streams.
    I will definitely check out the fixation stream.
    At this point I am not quite sure whether it would also work within the same connection session for smooth pursuit movements, but I will certainly find out.
    As for the varying sample rate, my assumption was that it might be sufficiently uniform during phases in which the gaze kinematics are sufficiently stationary, since my understanding is that the tracker switches rates to better suit the current situation.
    As all of this is a proof of concept in the end, this outcome is at least a sound exploration of where this specific tracker's characteristics reach their limits.
    Thank you very much for your help!

    Best

    #8853
    Grant [Tobii]
    Keymaster

    Hi @vwgx3, you are quite welcome! Glad to be of help 🙂

    Yes, please try out the fixation streams and report back the results, and let me know if there is any other information I can provide. You can easily use both the fixation stream and the filtered gaze stream together in the same application, so you may be able to adapt according to your needs.
