Hi @susanne, okay, thanks for that. Yes, I can confirm that there is no longer a means to create your own custom calibration routine (as was possible with the GazeSDK) unless you work with the Tobii Core SDK Stream Engine and purchase a special licence to do so.
What I meant previously was that a possible workaround would be to calibrate in a regular fashion on a screen and thereafter replace the screen with the stimulus such as a magazine, book, projected screen, etc.
However, now that you have explained that you want to track in the Z-direction as well, that does change the circumstances significantly. There is no setup we offer (including within the Tobii Pro range of trackers) that will correctly track your gaze location in three dimensions.
BTW, by 3D Gaze Vector, I assume you are referring to the Gaze Position (x, y, z) within the User Coordinate System, as described at http://developer.tobiipro.com/commonconcepts/coordinatesystems.html
For example, if you had an object close and an object far, but both in the same line of sight, there would be no way to determine which one you were looking at, even with the 3D Gaze Vector. Such tasks are possible, although only with head-mounted eye trackers such as the Tobii Pro Glasses 2 (https://www.tobiipro.com/product-listing/tobii-pro-glasses-2/).
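To make the ambiguity concrete, here is a small sketch (all coordinates are made up for illustration) showing that two objects at different depths along the same line of sight are geometrically indistinguishable given only a gaze origin and direction:

```python
# Hypothetical illustration of the depth ambiguity: the tracker can give you a
# gaze ray (origin + direction) in the user coordinate system, but any object
# lying on that ray produces the same ray. Coordinates below are invented.

def point_to_ray_distance(point, origin, direction):
    """Shortest distance from a 3D point to a ray origin + t*direction, t >= 0."""
    # Vector from the ray origin to the point
    v = [p - o for p, o in zip(point, origin)]
    # Normalise the direction and project v onto it
    norm = sum(d * d for d in direction) ** 0.5
    d_unit = [d / norm for d in direction]
    t = max(0.0, sum(a * b for a, b in zip(v, d_unit)))
    closest = [o + t * d for o, d in zip(origin, d_unit)]
    return sum((p - c) ** 2 for p, c in zip(point, closest)) ** 0.5

gaze_origin = (0.0, 0.0, 600.0)    # eye position in mm (hypothetical)
gaze_direction = (0.0, 0.0, -1.0)  # looking straight ahead

near_object = (0.0, 0.0, 300.0)    # 300 mm in front of the eye
far_object = (0.0, 0.0, 100.0)     # further away, same line of sight

# Both objects lie exactly on the gaze ray, so the ray alone cannot
# distinguish them: both distances come out as 0.0.
print(point_to_ray_distance(near_object, gaze_origin, gaze_direction))  # 0.0
print(point_to_ray_distance(far_object, gaze_origin, gaze_direction))   # 0.0
```

An off-ray object would give a non-zero distance, which is why lateral separation helps where depth separation does not.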
With all other trackers, including those from Tobii Tech (EyeX, 4C) and Tobii Pro (X60, T120, etc.), the final gaze *location* is only ever given within the 2D plane as initially calibrated.
All that being said, if your environmental setup is such that objects are spaced reasonably far apart, with little overlap in the Z-dimension, then you may be able to extract very approximate values for where the user is looking within that space. If by chance you have any photos of your setup, those would be useful to see. Thanks.
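As a rough sketch of what I mean (the object layout and threshold below are entirely hypothetical), you could project each object's centre onto the calibrated plane and assign the reported 2D gaze point to the nearest one, provided the objects do not overlap in that plane:

```python
# Hypothetical nearest-object assignment on the calibrated 2D plane.
# Gaze coordinates are normalised (0..1), as screen-based trackers report them;
# the object positions and the max_dist threshold are invented for illustration.

objects = {
    "magazine": (0.25, 0.50),  # assumed projected centre of each object
    "book":     (0.75, 0.50),
}

def nearest_object(gaze_xy, objects, max_dist=0.2):
    """Return the object whose centre is closest to the gaze point,
    or None if every object is further than max_dist away."""
    best_name, best_dist = None, float("inf")
    for name, (ox, oy) in objects.items():
        dist = ((gaze_xy[0] - ox) ** 2 + (gaze_xy[1] - oy) ** 2) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_dist else None

print(nearest_object((0.30, 0.48), objects))  # magazine
print(nearest_object((0.50, 0.95), objects))  # None (gaze far from any object)
```

The `max_dist` cut-off is there so that gaze samples landing nowhere near an object are reported as unassigned rather than forced onto the closest one.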