24/11/2014 at 14:37 #2111 | Jens Andersson (Participant)
The EyeX Interaction Tutorial's fixation behavior seems to strike a good balance of latency, smoothness, and stability. However, when I try the MinimalFixationDataStream sample, I don't get the same behavior. Are you doing additional processing on top of the fixation data in the Interaction Tutorial? If so, can you share that code?
Jens

27/11/2014 at 15:06 #2121 | Jenny [Tobii] (Participant)
Are you referring to the flashlight-like experience on the first page of the tutorial? I believe they created a filter using a combination of the lightly filtered gaze point data stream and the fixation data stream. There is a feature request pending on the EyeX Engine to add a data stream suitable for visualizing where the user is looking. If the request is accepted, the data stream will be made available through the EyeX API.
To build a similar filter yourself, you can experiment with a weighted average over the most recent gaze points (a fixed number of points, excluding any that are too old), and vary the amount of filtering depending on whether the eye movements are small or large. For example, average over a number of recent points when the distance between the new point and the latest one is small, and do almost no averaging when the distance is large. The result is a filter that is stable while the eyes are fixating on something, yet still responsive when the focus moves to another object on the screen.

27/11/2014 at 15:35 #2122 | Jens Andersson (Participant)
Yes, the flashlight effect is what I was asking about.
I think being able to get that kind of data directly out of the SDK would be a great addition. In the meantime, I’ll see if I can replicate that behavior using your suggestions.
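For anyone else trying the same thing, the approach Jenny describes (weighted averaging over recent gaze points, with a reset on large jumps) could be sketched roughly like this. This is an illustrative Python sketch, not Tobii's actual filter; the class name, the age window, the saccade threshold, and the linear recency weights are all made-up choices you would tune for your own application:

```python
import math
from collections import deque

class AdaptiveGazeFilter:
    """Sketch of a distance-dependent gaze smoothing filter.

    Small eye movements are smoothed by a recency-weighted average over
    recent samples; a large jump (a likely saccade) clears the history so
    the filtered point snaps to the new location. All parameter values
    here are illustrative guesses, not Tobii's tuning.
    """

    def __init__(self, max_age_ms=200.0, saccade_threshold_px=80.0):
        self.max_age_ms = max_age_ms
        self.saccade_threshold_px = saccade_threshold_px
        self.points = deque()  # entries: (timestamp_ms, x, y)

    def update(self, timestamp_ms, x, y):
        # Drop samples that are too old to matter.
        while self.points and timestamp_ms - self.points[0][0] > self.max_age_ms:
            self.points.popleft()

        if self.points:
            _, last_x, last_y = self.points[-1]
            if math.hypot(x - last_x, y - last_y) > self.saccade_threshold_px:
                # Large jump: treat it as a saccade and restart the
                # average so the output stays responsive.
                self.points.clear()

        self.points.append((timestamp_ms, x, y))

        # Weighted average favouring the newest samples.
        total_w = fx = fy = 0.0
        for i, (_, px, py) in enumerate(self.points, start=1):
            w = float(i)  # linear recency weight; exponential also works
            total_w += w
            fx += w * px
            fy += w * py
        return fx / total_w, fy / total_w
```

Feeding each new gaze sample through `update()` yields a point that barely moves during fixations but jumps immediately on large gaze shifts, which matches the stable-yet-responsive behavior described above.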