Are you referring to the flashlight-like experience on the first page of the tutorial? I believe they have created a filter using a combination of the lightly filtered gaze point data stream and the fixation data stream. There is a feature request pending on the EyeX Engine to add a data stream suitable for visualizing where the user is looking. If the request is accepted, the data stream will be made available through the EyeX API.
To make a similar filter yourself, you can experiment with a weighted average over the most recent gaze points (a fixed number of points, excluding points that are too old), and apply different amounts of filtering depending on whether the eye movements are small or large. For example, average over a number of recent points when the distance between the new point and the latest one is small, and do almost no averaging when the distance is large. That gives you a filter that is stable while the eyes are focusing on something, yet still responsive when the focus moves to another object on the screen.
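As a rough sketch of the idea, here is one way such a filter could look in Python. Everything here is illustrative: the class name, the age limit, and the jump threshold are assumptions you would tune for your own application, and none of it is part of the EyeX API.

```python
import time
from collections import deque
from math import hypot

class AdaptiveGazeFilter:
    """Distance-adaptive gaze smoothing: heavy averaging within a fixation,
    near-instant response on large jumps. All thresholds are illustrative."""

    def __init__(self, max_points=10, max_age=0.2, jump_threshold=80.0):
        self.max_age = max_age                # seconds a sample stays relevant
        self.jump_threshold = jump_threshold  # pixels; a larger move is a new target
        self.points = deque(maxlen=max_points)

    def update(self, x, y, timestamp=None):
        t = time.monotonic() if timestamp is None else timestamp
        # Drop samples that are too old to matter.
        while self.points and t - self.points[0][2] > self.max_age:
            self.points.popleft()
        # A large jump means the eyes moved to a new target:
        # clear the history so the output snaps there immediately.
        if self.points:
            last_x, last_y, _ = self.points[-1]
            if hypot(x - last_x, y - last_y) > self.jump_threshold:
                self.points.clear()
        self.points.append((x, y, t))
        # Weighted average with newer samples weighted more heavily.
        n = len(self.points)
        weights = range(1, n + 1)
        total = sum(weights)
        fx = sum(w * px for w, (px, _, _) in zip(weights, self.points)) / total
        fy = sum(w * py for w, (_, py, _) in zip(weights, self.points)) / total
        return fx, fy
```

You would feed each incoming gaze point into `update()` and draw the visualization at the returned coordinates; while the gaze dwells on one spot the output barely moves, and when the gaze leaps across the screen the history is discarded so the output follows without lag.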