I’d like to integrate the Gaze data stream and the Fixation data stream


Viewing 10 posts - 1 through 10 (of 10 total)

    Thank you for checking this topic.

I would like to integrate the Gaze data stream and the Fixation data stream so that I can receive both types of data at the same time. Would that be possible?

In addition, I would like to know the Eye position data stream’s rate (in Hz).
I am going to write a program that counts blinks based on this sample code.

Development environment:
Tobii EyeX SDK for C/C++ 1.8.498 and the Tobii Eye Tracker 4C

    Thank you

    Grant [Tobii]

    Hi @stratus,

Yes, you should be able to obtain both the Gaze Data Stream and the Fixation Data Stream simultaneously using the Tobii Core SDK, which provides further enhancements over the now-deprecated EyeX SDK with an almost identical syntax.

In addition, we have included a number of detailed sample projects to help you get up and running @ https://github.com/Tobii/CoreSDK
which demonstrate several ways of obtaining data streams, including fixation and gaze, and should be exactly what you are looking for!

In terms of eye tracker frequency, bear in mind that with the Tobii Eye Tracker 4C the capture rate fluctuates around an average value of approximately 90 Hz.

Should your development necessitate an eye tracker with a more precise sample rate (or indeed if you intend to store gaze data for scientific use), then I would refer you to the Tobii Pro range of trackers and software. In the meantime, check out the aforementioned Core SDK samples and please let us know how you get on.


    Thank you Grant

These are almost complete, thanks to your help.
I looked at some Tobii Pro products; they are really attractive, so we will consider them.
I have two more questions.

1. Why are the coordinates from the Gaze data stream and the coordinates from the Fixation data stream not identical? I am comparing samples with the same timestamp.
    X:1859.9 Y:961.8 Timestamp:1740003ms [gaze]
    X:1867.6 Y:965.6 Timestamp:1740003ms [fixation]
    X:546.4 Y:288.7 Timestamp:1673166ms [gaze]
    X:547.3 Y:289.1 Timestamp:1673166ms [fixation]
They are sometimes similar, but never exactly the same value.

2. My predecessor built some systems with C++ and the EyeX SDK,
so I am using C++ now. However, the new Core SDK is C#. Do you have a Core SDK for C++?
Could you please tell me if you know of any way to use it from C++?

    Thank you

    Grant [Tobii]

Hi @stratus, it would help to begin by appreciating the difference between a gaze point and a fixation point.

In simple terms, a gaze point is the point on the screen where the user is looking at a given moment in time. However, because the gaze point is an intrinsically noisy signal, the GazePointDataStream applies a configured filter to stabilize it.

Fixations, on the other hand, are derived from a series of gaze points: one point representing the beginning of the fixation, a set of intermediate points during the fixation, and a last point corresponding to its end. As with the GazePointDataStream, the fixation stream also has its own filter applied.

Accordingly, whilst one might expect the locations of a gaze point and a fixation to be similar within a short time period, they will never be exactly the same.

    There is a great deal of documentation available online that goes into further detail on this topic. One such link with good depth can be found @


Regarding the use of C++: whilst we have plans to add C++ bindings to the Core SDK Interaction API in the near future, currently the only options are the Stream Engine API (http://developer.tobii.com/tobii-core-sdk/) or the legacy EyeX SDK.


Please bear with me for asking so many questions, but I have two requests.
1. I would like to know with which program I can get more accurate coordinates.
2. Could you please tell me how I should change the code to integrate the Gaze Stream and the Fixation Stream?

I am currently building an application for the eye tracker from the sample programs, to get and output fixation data and where I am staring.
It has not changed much from the original sample code. I am comparing the accuracy of the Gaze Stream and the Fixation Stream,
so I need accurate coordinates when I stare at one point on the screen.
To put it simply: which program is the better one for getting almost the same coordinates when I stare at a point on the screen?

Perhaps if the first question is answered, the second will be unnecessary.
However, I am asking it just in case.
I have no idea how to get gaze data and fixation data with the same timestamp in the same project (program).

Code 1 can output gaze data and fixation data in the same project, but the timestamps are different.

I think it must be easy, but I do not have enough skill to realize it,
so I would like to ask about it: “gaze data and fixation data with the same timestamp in the same project”.

The ideal output is something like: “Gaze (100.1, 100.2) timestamp: 123456 ms / Fixation (101.4, 101.0) timestamp: 123456 ms”
    Thank you

Code 1 (based on the minimal fixation data stream sample) --------------------------------------------------------------------

success &= txCreateGazePointDataBehavior(hInteractor, &params) == TX_RESULT_OK;


void OnGazeDataEvent(TX_HANDLE hGazeDataBehavior)
{
    TX_GAZEPOINTDATAEVENTPARAMS eventParams;
    if (txGetGazePointDataEventParams(hGazeDataBehavior, &eventParams) == TX_RESULT_OK) {
        printf("\n\nGaze: (%.1f, %.1f) timestamp %.0f ms\n", eventParams.X, eventParams.Y, eventParams.Timestamp);
    } else {
        printf("Failed to interpret gaze data event packet.\n");
    }
}

//--------------------------------- get gaze data (above) ---------------------------------
//--------------------------------- get fixation data (below) -----------------------------

// Initializes g_hGlobalInteractorSnapshot with an interactor that has the Fixation Data behavior.

void OnFixationDataEvent(TX_HANDLE hFixationDataBehavior)
{
    TX_FIXATIONDATAEVENTPARAMS eventParams;
    TX_FIXATIONDATAEVENTTYPE eventType;
    char* eventDescription;

    if (txGetFixationDataEventParams(hFixationDataBehavior, &eventParams) == TX_RESULT_OK) {
        eventType = eventParams.EventType;
        eventDescription = (eventType == TX_FIXATIONDATAEVENTTYPE_DATA) ? "Data"
            : ((eventType == TX_FIXATIONDATAEVENTTYPE_END) ? "End" : "Begin");
        printf("Fixation [%s]: (%.1f, %.1f) timestamp %.0f ms\n",
               eventDescription, eventParams.X, eventParams.Y, eventParams.Timestamp);
    } else {
        printf("Failed to interpret fixation data event packet.\n");
    }
}

    //Callback function invoked when an event has been received from the EyeX Engine.


    txGetAsyncDataContent(hAsyncData, &hEvent);

    // NOTE. Uncomment the following line of code to view the event object. The same function can be used with any interaction object.

    if (txGetEventBehavior(hEvent, &hBehavior, TX_BEHAVIORTYPE_FIXATIONDATA) == TX_RESULT_OK)
    {OnFixationDataEvent(hBehavior); txReleaseObject(&hBehavior);}

    if (txGetEventBehavior(hEvent, &hBehavior, TX_BEHAVIORTYPE_GAZEPOINTDATA) == TX_RESULT_OK)
    {OnGazeDataEvent(hBehavior); txReleaseObject(&hBehavior);}

    // NOTE since this is a very simple application with a single interactor and a single data stream,
    // our event handling code can be very simple too. A more complex application would typically have to
    // check for multiple behaviors and route events based on interactor IDs.


    (Omitted below)

    Grant [Tobii]

    Hi @stratus,

>> 1. I’d like to know with which program I can get more accurate coordinates.

Fixation is not more or less accurate than gaze; they are simply different concepts, as detailed in the previous post.

>> Which program is the better one for getting almost the same coordinates when I stare at a point on the screen?

Certainly, if you are trying to keep consistency in which coordinates are reported when gazing at a similar part of the screen, then the Gaze Data Stream is what you need.

>> “Gaze data and fixation data with the same timestamp in the same project”

The timestamps will necessarily be different due to the different nature of these two data streams. You will never get exactly the same value for both, as a fixation is an average over many timestamps, whereas gaze data is taken at a specific time (although with filtering applied).

I suspect that for your needs you should focus only on the Gaze Data Stream, with light filtering applied.

However, if you need true raw precision without any filtering (which is what I gather from your post?), then it might be best for you to look at the Tobii Core SDK Stream Engine, where you get the gaze coordinates on screen without any filtering applied.

    Please bear in mind that the Core SDK licence is intended only for interactive and gaming purposes.

    Saving or transmitting gaze data in any form is considered analytical use and will violate user privacy. You can find a copy of the detailed license terms @


    Should you wish to store gaze data for analysis then it will be necessary for you to purchase the Tobii Pro SDK licence.


Thank you, Grant. I appreciate your cooperation.
I thought that recording with the same timestamp was impossible, just as I expected:
every code pattern I tried ended in failure, but I kept looking for a good way to do it. Thank you, and sorry for the trouble.

Your reply was pertinent advice, although new questions have arisen.
Do I need the SDK licence to output and use the data under any circumstances?

I am using the Eye Tracker 4C for non-commercial purposes:
sometimes to share information about where I was staring while playing a game with friends, and sometimes for simple research activities at a university.
Every activity is for a non-commercial purpose.
Nevertheless, must I purchase the Tobii Pro SDK licence?

If that is the case, I will report this and discuss it.
    Thank you

    Grant [Tobii]

Hi @stratus, yes, I am afraid that even if the application is for non-commercial purposes, if the data is to be used for analysis, especially in an academic environment, then this licence would be necessary.

    I would suggest you contact [email protected] and explain your intentions.

Further details on the Tobii Pro SDK can be found online @ http://developer.tobii.com/tobii-sdk-guide/


Have you guys released the C++ version of the Interaction API? I can’t use C# because my algorithm is written in C++, and I don’t know how to use the Stream Engine… Any help would be appreciated! Need help badly~

    Grant [Tobii]

    Hi @leengu, sorry but the C++ variant of the Interaction API is not yet available for public use. However, we do include a number of stream samples @ https://github.com/Tobii/stream_engine and a brief
    getting started guide @ https://tobii.github.io/stream_engine/

    However, if you are intending on using WPF, etc then the transition may indeed be quite tricky.

    If you could kindly expand on what you are trying to achieve, we may be able to figure out a workaround based on your needs. Many thanks.
