Temporal filtering

Viewing 6 posts - 1 through 6 (of 6 total)

    Hi, I read that for the Eye Tracker 4C the “Gaze data is lightly filtered, e.g. temporal filtering.”

    Can you please explain what “temporal filtering” is, and what other filters are applied to gaze data on the 4C?


    Grant [Tobii]

    Hi @torrero007, “Lightly filtered” (the default) is an adaptive filter weighted by both the age of the gaze data points (GazePointData) and the velocity of the eye movements. The filter is designed to remove noise while remaining responsive to quick eye movements, so this option should not pose a responsiveness issue for your application.
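    Tobii has not published the exact algorithm, but a filter of this general kind can be sketched as exponential smoothing whose weight adapts to eye-movement velocity: heavy smoothing during fixations, light smoothing during saccades. Every name and threshold below is an illustrative assumption, not Tobii's actual implementation.

    ```python
    import math

    def adaptive_filter(samples, slow_alpha=0.1, fast_alpha=0.9,
                        velocity_threshold=50.0):
        """Illustrative adaptive temporal filter (NOT Tobii's algorithm).

        samples: list of (timestamp_s, x, y) gaze points.
        Smooths heavily when the eye is nearly still (fixation) and
        lightly when it moves fast (saccade), to stay responsive.
        """
        filtered = []
        prev = None
        for t, x, y in samples:
            if prev is None:
                filtered.append((t, x, y))
                prev = (t, x, y)
                continue
            pt, px, py = prev
            dt = max(t - pt, 1e-6)
            velocity = math.hypot(x - px, y - py) / dt
            # Fast movement: trust the new sample; slow drift: smooth it out.
            alpha = fast_alpha if velocity > velocity_threshold else slow_alpha
            fx = px + alpha * (x - px)
            fy = py + alpha * (y - py)
            filtered.append((t, fx, fy))
            prev = (t, fx, fy)
        return filtered
    ```

    The trade-off Grant describes is visible in the two alpha values: a large jump between samples is passed through almost unchanged, while small jitter around a fixation point is strongly damped.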

    If you are developing with the Tobii Core SDK and find the delay a hindrance, you may select “Unfiltered”, where no filtering is performed by the Interaction Engine (aside from the removal of invalid data points and the averaging of the gaze points from both eyes).
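    The two steps that remain even in “Unfiltered” mode (invalid-sample removal and binocular averaging) could be sketched roughly as follows. The tuple layout and the single-eye fallback are my assumptions for illustration, not the Interaction Engine's actual data model.

    ```python
    def combine_eyes(left, right):
        """Combine per-eye gaze samples into one point (illustrative sketch).

        left, right: (x, y, is_valid) for each eye.
        Returns the averaged point, a single valid eye's point
        (an assumed fallback), or None when neither eye is valid.
        """
        lx, ly, l_valid = left
        rx, ry, r_valid = right
        if l_valid and r_valid:
            return ((lx + rx) / 2.0, (ly + ry) / 2.0)
        if l_valid:
            return (lx, ly)  # assumption: fall back to the one valid eye
        if r_valid:
            return (rx, ry)
        return None  # drop samples where neither eye is valid
    ```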

    Selecting this option is not without its drawbacks, however: the gaze point is an intrinsically noisy signal, which is why filters are used to stabilize it. There is therefore a trade-off between stability and responsiveness, so I would recommend testing your application with both options and deciding which best suits your particular needs.


    Hi Grant and Vasilis, thanks for this post.

    I have recorded data with the Tobii 4C as well (using a Pro license) and would like to read more about all the “pre-processing” that has been automatically applied to the raw data.

    Steps such as the mentioned filtering, invalid data point removal, and averaging across eyes.

    Where can I obtain that information?

    Also: where can I find more information on how to change the settings of my Tobii 4C so that some of these steps are not performed?

    Grant [Tobii]

    Hi @sappelhoff, you should be able to find a number of previously created scripts produced by other users at the Tobii App Market.


    These scripts were generally tailored for use with the Tobii Pro SDK, but I believe the methodology (and perhaps code) should still work with the Tobii Tech range of SDKs, namely the Tobii Stream Engine API and Interaction Library API.

    Please let us know if we can provide any further information. Best Wishes.


    Hi Grant,

    thank you. I think my question was a bit unclear: I have already recorded my eye-tracking data using the Pro license and the Tobii 4C … by making use of the Python API that is part of the Tobii SDK.

    My question is: what processing steps have been applied to the data between the raw collection on the eye tracker and the data being saved to disk?

    Based on the original post in this forum thread, I assume that “temporal filtering” has been applied. And in the follow-up post, you mention further processing steps. I would like to learn more about these steps – can you provide information or documentation on this, please?

    Before analyzing my data, I need to know what has “already been done with the data”.

    Grant [Tobii]

    Hi @sappelhoff, I am afraid we are unable to divulge the specific algorithms our eye trackers apply between the image being captured and the raw gaze data being returned to the user. However, the raw data supplied by the SDK comes with a corresponding timestamp, and no filters per se have been applied.
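    Since each raw sample carries a timestamp, one practical pre-analysis check you can run yourself is to inspect the inter-sample intervals, for example to confirm the effective sampling rate and spot dropped samples. This is a generic sketch, not Tobii-specific code; it assumes device timestamps in microseconds.

    ```python
    def estimate_sampling_rate(timestamps_us):
        """Estimate the effective sampling rate (Hz) from device
        timestamps given in microseconds. Returns None if there are
        fewer than two samples."""
        if len(timestamps_us) < 2:
            return None
        intervals = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
        mean_interval_us = sum(intervals) / len(intervals)
        return 1e6 / mean_interval_us
    ```

    Comparing the estimate against the tracker's nominal rate (90 Hz for the 4C) is a quick sanity check that the recording pipeline did not drop or resample data.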

    Perhaps of interest to you is this paper, which was written for the EyeX but should still provide a reasonable description for your needs:

