Reply To: I’d like to integrate the Gaze data stream and the Fixation data stream

#7515
poyo
Participant

Please bear with me for asking so many questions, but I have two requests.
1. I’d like to know which program gives me more accurate coordinates.
2. Could you please tell me how I should change the code to integrate the Gaze Stream and the Fixation Stream?

I’m currently building an application for the eye tracker, based on the sample programs, that gets and outputs fixation data and where I am staring.
It has not been changed much from the original sample code. I’m comparing the accuracy of the Gaze Stream and the Fixation Stream,
so I need accurate coordinates when I stare at one point on the screen.
To put it simply: which program is the better one for getting nearly the same coordinates each time I stare at a point on the screen?

Perhaps if the first question is answered, the second one will turn out to be unnecessary.
However, I will ask it just in case.
I have no idea how to get gaze data and fixation data with the same timestamp in the same project (program).

Code 1 below can output gaze data and fixation data in the same project, but the timestamps are different.

I think this is surely easy, but I don’t have enough skill to work it out myself,
so I would like to ask about it: “gaze data and fixation data with the same timestamp, in the same project”.

The ideal output would look like this: “Gaze (100.1, 100.2) timestamp: 123456 ms Fixation (101.4, 101.0) timestamp: 123456 ms”. I have also put a rough sketch of my current idea after the code listing at the end of this post.
Thank you

Code 1 (based on the minimal fixation data stream sample) ————————————————————————————————

Added:
TX_GAZEPOINTDATAPARAMS params = { TX_GAZEPOINTDATAMODE_LIGHTLYFILTERED }; (at line 29)
success &= txCreateGazePointDataBehavior(hInteractor, &params) == TX_RESULT_OK; (at line 38)

(omitted)
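For reference, this is roughly what my InitializeGlobalInteractorSnapshot looks like after the change. It is a sketch rather than a verbatim copy of my project, so the parameter variable names and the fixation mode may differ; InteractorId and g_hGlobalInteractorSnapshot are the globals from the sample.

// Initializes g_hGlobalInteractorSnapshot with an interactor that carries
// both the Fixation Data behavior and the Gaze Point Data behavior.
BOOL InitializeGlobalInteractorSnapshot(TX_CONTEXTHANDLE hContext)
{
    TX_HANDLE hInteractor = TX_EMPTY_HANDLE;
    TX_FIXATIONDATAPARAMS fixationParams = { TX_FIXATIONDATAMODE_SENSITIVE };
    TX_GAZEPOINTDATAPARAMS gazePointParams = { TX_GAZEPOINTDATAMODE_LIGHTLYFILTERED }; // added (around line 29)
    BOOL success;

    success = txCreateGlobalInteractorSnapshot(
        hContext,
        InteractorId,
        &g_hGlobalInteractorSnapshot,
        &hInteractor) == TX_RESULT_OK;

    // One interactor can carry both behaviors, so both event types are delivered to HandleEvent.
    success &= txCreateFixationDataBehavior(hInteractor, &fixationParams) == TX_RESULT_OK;
    success &= txCreateGazePointDataBehavior(hInteractor, &gazePointParams) == TX_RESULT_OK; // added (around line 38)

    txReleaseObject(&hInteractor);

    return success;
}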

void OnGazeDataEvent(TX_HANDLE hGazeDataBehavior)
{
    TX_GAZEPOINTDATAEVENTPARAMS eventParams;

    if (txGetGazePointDataEventParams(hGazeDataBehavior, &eventParams) == TX_RESULT_OK) {
        printf("\n\nGaze: (%.1f, %.1f) timestamp %.0f ms\n", eventParams.X, eventParams.Y, eventParams.Timestamp);
    } else {
        printf("Failed to interpret gaze data event packet.\n");
    }
}
//———————————————↑get gaze data ——————————————————
//———————————————–↓get fixation data ————————————————–

// Initializes g_hGlobalInteractorSnapshot with an interactor that has the Fixation Data behavior (the function itself is sketched above and omitted here).

void OnFixationDataEvent(TX_HANDLE hFixationDataBehavior)
{
    TX_FIXATIONDATAEVENTPARAMS eventParams;
    TX_FIXATIONDATAEVENTTYPE eventType;
    char* eventDescription;

    if (txGetFixationDataEventParams(hFixationDataBehavior, &eventParams) == TX_RESULT_OK) {
        eventType = eventParams.EventType;
        eventDescription = (eventType == TX_FIXATIONDATAEVENTTYPE_DATA) ? "Data"
            : ((eventType == TX_FIXATIONDATAEVENTTYPE_END) ? "End" : "Begin");

        printf("Fixation [%s]: (%.1f, %.1f) timestamp %.0f ms\n", eventDescription, eventParams.X, eventParams.Y, eventParams.Timestamp);
    } else {
        printf("Failed to interpret fixation data event packet.\n");
    }
}

//Callback function invoked when an event has been received from the EyeX Engine.

void TX_CALLCONVENTION HandleEvent(TX_CONSTHANDLE hAsyncData, TX_USERPARAM userParam)
{
    TX_HANDLE hEvent = TX_EMPTY_HANDLE;
    TX_HANDLE hBehavior = TX_EMPTY_HANDLE;

    txGetAsyncDataContent(hAsyncData, &hEvent);

    // NOTE. Uncomment the following line of code to view the event object. The same function can be used with any interaction object.
    //OutputDebugStringA(txDebugObject(hEvent));

    // Route the event to the handler for whichever behavior it carries.
    if (txGetEventBehavior(hEvent, &hBehavior, TX_BEHAVIORTYPE_FIXATIONDATA) == TX_RESULT_OK) {
        OnFixationDataEvent(hBehavior);
        txReleaseObject(&hBehavior);
    }

    if (txGetEventBehavior(hEvent, &hBehavior, TX_BEHAVIORTYPE_GAZEPOINTDATA) == TX_RESULT_OK) {
        OnGazeDataEvent(hBehavior);
        txReleaseObject(&hBehavior);
    }

    // NOTE: since this is a very simple application with a single interactor and two data streams,
    // our event handling code can be very simple too. A more complex application would typically have to
    // check for multiple behaviors and route events based on interactor IDs.

    txReleaseObject(&hEvent);
}
(Omitted below)
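For question 2, my rough idea (not tested yet) is to cache the most recent gaze sample in a global variable and print it from the fixation handler, so that both values appear together. The variable names below are my own, not from the SDK samples, and I am not sure this is the right approach, since the two streams arrive as separate events and their engine timestamps may still differ slightly. If it is a reasonable direction, I would replace my two handlers above with something like this:

// Sketch only: remember the latest gaze sample and print it together with each fixation event.
static TX_GAZEPOINTDATAEVENTPARAMS g_lastGazeSample;
static BOOL g_hasGazeSample = FALSE;

void OnGazeDataEvent(TX_HANDLE hGazeDataBehavior)
{
    TX_GAZEPOINTDATAEVENTPARAMS eventParams;

    if (txGetGazePointDataEventParams(hGazeDataBehavior, &eventParams) == TX_RESULT_OK) {
        g_lastGazeSample = eventParams; // keep the most recent gaze point
        g_hasGazeSample = TRUE;
    }
}

void OnFixationDataEvent(TX_HANDLE hFixationDataBehavior)
{
    TX_FIXATIONDATAEVENTPARAMS eventParams;

    if (txGetFixationDataEventParams(hFixationDataBehavior, &eventParams) == TX_RESULT_OK && g_hasGazeSample) {
        // Print both samples on one line, each with the timestamp reported by the engine.
        printf("Gaze (%.1f, %.1f) timestamp: %.0f ms Fixation (%.1f, %.1f) timestamp: %.0f ms\n",
            g_lastGazeSample.X, g_lastGazeSample.Y, g_lastGazeSample.Timestamp,
            eventParams.X, eventParams.Y, eventParams.Timestamp);
    }
}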