Multi-display support

    #959

    According to the documentation, the Tobii REX supports multi-display setups with adjacent monitors. The user calibrates on one monitor, all coordinates are reported relative to that monitor, and the set of monitors is treated as a single virtual screen.

    I am working on a scenario with multiple screens that are not close to each other. It would be great if the Tobii REX could detect which screen the user is gazing at and allow eye tracking on each of these screens.

    The displays are placed such that the user’s eyes are always visible to the Tobii REX when the user looks at any of the screens.

    Are there plans for supporting such a scenario with a single Tobii REX?

    Kind regards,

    Jan Willem

    #961
    Jenny [Tobii]
    Participant

    Hi Jan Willem,

    The most important limitation to consider for your scenario is the maximum screen plane size supported by the Tobii REX. The REX tracks well on screens up to 24″; above that, tracking quality degrades the further the gaze point is from the tracker. (This is due to viewing angles and the distance between the eye tracker’s illuminators.)

    As you describe it, it sounds like you want to use one REX to cover a physical area much larger than 24″. Would your system still work well with lower tracking accuracy outside those bounds?

    If you think your system will be feasible with the above size and accuracy limitations, the next thing is how to get a non-adjacent multiple screen setup to work with the REX.

    Advanced multiple screen setup with non-adjacent monitors is not supported by the EyeX Engine. The only possibility is to use the Gaze SDK.

    The Gaze SDK will give you normalized gaze-points, where (0,0) is the upper left corner and (1,1) is the lower right corner of the eye tracker’s screen plane. Gaze points that fall outside the bounds of the screen plane will have values lower than 0 and/or higher than 1.

    You would have to provide the REX with a screen plane defined by you, and map the normalized gaze point values to physical pixels on the respective monitor, taking into consideration the relative physical positions of the monitors. In other words, it would take some trigonometry calculations to get it working, and the screens would have to be fixed in their relative positions.
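    To make the mapping Jenny describes concrete, here is a minimal sketch. It is not part of any Tobii SDK: the struct, function, and field names are made up for the example, and it assumes all displays lie roughly in the same plane as the screen plane defined for the tracker, with their positions measured in millimetres from that plane’s top-left corner.

        #include <stdbool.h>

        /* Illustrative only -- none of these names come from a Tobii SDK.
           A monitor is described by its physical position (in mm) inside the
           screen plane that was defined for the tracker, plus its resolution. */
        typedef struct {
            double origin_x_mm;          /* left edge, from the plane's top-left corner */
            double origin_y_mm;          /* top edge */
            double width_mm, height_mm;  /* physical size of the panel */
            int    width_px, height_px;  /* resolution */
        } monitor_t;

        /* Map a normalized gaze point ((0,0) = top-left, (1,1) = bottom-right of
           the defined screen plane; values outside that range fall outside the
           plane) to pixel coordinates on whichever monitor contains it. */
        bool map_gaze_to_monitor(double norm_x, double norm_y,
                                 double plane_width_mm, double plane_height_mm,
                                 const monitor_t* monitors, int monitor_count,
                                 int* hit_monitor, int* px, int* py)
        {
            /* Normalized coordinates -> millimetres within the screen plane. */
            double x_mm = norm_x * plane_width_mm;
            double y_mm = norm_y * plane_height_mm;

            for (int i = 0; i < monitor_count; ++i) {
                const monitor_t* m = &monitors[i];
                double local_x = x_mm - m->origin_x_mm;
                double local_y = y_mm - m->origin_y_mm;
                if (local_x >= 0.0 && local_x <= m->width_mm &&
                    local_y >= 0.0 && local_y <= m->height_mm) {
                    *hit_monitor = i;
                    *px = (int)(local_x / m->width_mm  * m->width_px);
                    *py = (int)(local_y / m->height_mm * m->height_px);
                    return true;
                }
            }
            return false;  /* gaze falls between or beyond the monitors */
        }

    For screens tilted relative to the defined plane (likely in a car cockpit), each gaze sample would additionally have to be projected onto that screen’s own plane, which is where the trigonometry mentioned above comes in; the sketch covers only the coplanar case.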

    How to do a screen setup in the Gaze SDK is described in the Gaze SDK documentation.

    #965

    Hi,

    Thanks for your response!

    We can work with limited accuracy outside a physical area of 24″. I am actually already using the Gaze API so there is no problem there either. It’s an in-car scenario where we have multiple displays that are fixed in the car. So it should be feasible!

    Kind regards,

    Jan Willem

    #7254
    Yann
    Participant

    Hi Tobii team,
    Sorry to dig up this old topic, but it is the one that best answers my question: how do I set up multi-display support?
    I am working in the same kind of environment as Jan Willem, namely an in-car scenario. I understand what Jenny said and would like to implement it, but I can’t find which SDK functions have to be used to perform a multi-screen setup. Where exactly is this covered in the SDK documentation?

    Thank you in advance,
    Yann Roussel

    #7258
    Grant [Tobii]
    Keymaster

    Hi @yannrl7, the old Tobii Gaze SDK is no longer available and has been replaced by the Tobii Stream Engine API, available @ http://developer.tobii.com/tobii-core-sdk/

    However, at the current time neither the Interaction API nor the Stream Engine API supports multiple-monitor setups, so this is not documented.

    That being said, the previous advice still holds: the normalised gaze co-ordinates lie between 0 and 1 for the active screen. Negative values, or values greater than 1, represent gaze data that lies outside the active screen but is still detectable by the eye tracker. Your program could theoretically use these to map gaze onto another adjacent screen, but I am afraid this is not something we support for the time being. I hope this clears up your enquiry.
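    As a rough sketch of that in-bounds/out-of-bounds distinction, here is what a gaze-point callback could look like. The tobii_gaze_point_t struct and TOBII_VALIDITY_VALID constant follow the Stream Engine’s gaze point stream as documented at the time; verify the exact definitions in tobii_streams.h for your SDK version. The off-screen routing itself is the application’s own logic, not a supported Tobii feature.

        #include <stdio.h>
        #include <tobii/tobii.h>
        #include <tobii/tobii_streams.h>

        /* Called for each gaze point reported by the eye tracker. */
        void on_gaze_point(tobii_gaze_point_t const* gaze_point, void* user_data)
        {
            (void)user_data;
            if (gaze_point->validity != TOBII_VALIDITY_VALID)
                return;

            float x = gaze_point->position_xy[0];
            float y = gaze_point->position_xy[1];

            if (x >= 0.0f && x <= 1.0f && y >= 0.0f && y <= 1.0f) {
                /* Gaze is on the active (calibrated) screen. */
                printf("active screen: %.3f, %.3f\n", x, y);
            } else {
                /* Gaze is off the active screen but still reported; mapping it
                   onto another display is up to the application. */
                printf("off-screen:    %.3f, %.3f\n", x, y);
            }
        }

        /* Subscription would look like
               tobii_gaze_point_subscribe( device, on_gaze_point, 0 );
           with device creation and the callback-processing loop taken from the
           Stream Engine documentation linked later in this thread. */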

    #7289
    Yann
    Participant

    Thank you for your answer. So just to be clear: is the use of normalised coordinates (0 to 1 for the active screen, negative or greater than 1 outside it) still possible with the Stream Engine API, or is there some comparable method?

    I would need, at the very least, to force the Tobii to keep sending its stream of coordinates when I am looking somewhere other than the active screen.

    PS: just for information, I am using a Tobii 4C

    #7291
    Grant [Tobii]
    Keymaster

    Hi @yannrl7, yes, that’s correct: the Stream Engine provides gaze co-ordinates in normalised format.

    Further details of the gaze data stream are available online @ https://tobii.github.io/stream_engine/#tobii_streamsh

    Please feel free to share your code and experiences once your project is up and running for the benefit of other users and of course let us know if you hit any further problems.

    #7308
    Yann
    Participant

    Hi Grant, hi Tobii,
    My idea is to calibrate the Tobii using an overhead projector.
    I need to use the Tobii with quite a big screen (I know there is a loss of accuracy, but it is still precise enough).
    I then use a Matlab toolbox to get the gaze coordinates (in pixels) and will do the math to convert them to mm.
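    That pixel-to-millimetre step is a simple scaling by the physical size of the projected image; a minimal sketch, where the pixel and millimetre dimensions are placeholders to be replaced by the measured size of the projection:

        /* Hypothetical helper: convert a gaze point given in pixels of the
           projected image into millimetres. The width/height arguments are
           placeholders, not values taken from this thread. */
        void pixels_to_mm(double px, double py,
                          double width_px, double height_px,
                          double width_mm, double height_mm,
                          double* x_mm, double* y_mm)
        {
            *x_mm = px * (width_mm / width_px);
            *y_mm = py * (height_mm / height_px);
        }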

    Calibrating with the overhead projector was no problem; it worked, and the coordinates were reported correctly.
    But I do not have constant access to the projector, so my questions are:
    Is there a way to override the Tobii recalibration request that appears each time you connect or disconnect a screen?
    Is there a way to access the calibration profiles?

    The aim of my project is to integrate the Tobii into a car to follow the driver’s gaze, so with no screen at all.
    But the first step would be to keep using the projector’s calibration with a smaller screen.

    Thank you in advance,
    Yann

    #7310
    Grant [Tobii]
    Keymaster

    Hi @yannrl7, that sounds like a pretty cool project indeed!

    Regarding your request: it may be possible to transfer calibration profiles across systems, but altering parameters such as the per-point calibration information is not possible.

    Do you mean you wish to induce a recalibration, or to prevent one when changing screens? Certainly, once the software detects a screen change the existing calibration profile will not work, but the Stream Engine SDK may allow you to continue regardless. Have you looked at this lower-level API we offer?

    If you could kindly be explicit about what you want to do on both of the above points, I will see what solution we can find for you. Thanks.

    #7311
    Yann
    Participant

    What I mean is your second proposition: prevent recalibration when changing screens (in my case, when disconnecting the projector).
    This way, I can perform a calibration with the projector, measure the physical “screen” dimensions and the tracker’s position relative to it, and then adapt it to a car cockpit.

    #7317
    Grant [Tobii]
    Keymaster

    Hi @yannrl7, OK, thanks for the clarification. Yes, as I posted previously, it seems the only way to circumvent this is through the low-level Stream Engine API, for which little support is provided (owing to the very basic nature of the API).

    Have you already looked at this as a possible solution? Certainly, the ability to avoid recalibration when changing screens is not available via the Interaction API.

    Details of the Stream Engine are available online @ http://developer.tobii.com/tobii-core-sdk/

    Something else to consider is the Tobii Pro SDK, which is compatible with your eye tracker; it requires a special licence to operate, but may be better suited to the custom environment you wish to work with.

    You can read more about the Tobii Pro SDK @ http://developer.tobii.com/tobii-pro-sdk/

    #7318
    Yann
    Participant

    Hi Grant,
    I have already tried all the Tobii Core SDK possibilities, and access to any part of the calibration process or calibration profiles isn’t among them. I find this unfortunate, because it drastically limits development possibilities.

    My persistent problem is the display_area variable set during calibration, which prevents the eye tracker from sending data when I look outside the calibrated screen. As Jenny said before, the Tobii keeps tracking outside these bounds ((0,0) to (1,1) in normalized coordinates), but only in a very limited region:
    From -0.2 to 1.2 for X,
    From -0.2 to 2 for Y
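    To put rough numbers on those limits: the physical region in which gaze is still reported scales with the size of the calibrated display_area, which is why a projected image enlarges it. A small sketch, using the bounds reported above (the poster’s observations, not an official specification) and placeholder display sizes:

        #include <stdio.h>

        int main(void)
        {
            /* Normalized tracking limits as observed above (not official). */
            const double min_x = -0.2, max_x = 1.2;
            const double min_y = -0.2, max_y = 2.0;

            /* Placeholder display_area sizes in millimetres. */
            const double monitor_w = 530.0,  monitor_h = 300.0;   /* roughly a 24" monitor */
            const double beamer_w  = 1600.0, beamer_h  = 900.0;   /* a projected image */

            printf("24\" monitor : %.0f x %.0f mm of reported gaze\n",
                   (max_x - min_x) * monitor_w, (max_y - min_y) * monitor_h);
            printf("projection  : %.0f x %.0f mm of reported gaze\n",
                   (max_x - min_x) * beamer_w, (max_y - min_y) * beamer_h);
            return 0;
        }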

    This is why I tried calibrating with a projector, which offers a much bigger physical gaze-tracking zone, but I cannot really take advantage of it if I cannot prevent recalibration.
    It is regrettable that I would be forced to buy an expensive licence (more than 10 times the price of my 4C device) just to benefit from this one additional function.

    Is there really no way to access these functions without the Pro SDK?
    https://tobii.github.io/CoreSDK/api/Tobii.Interaction.Framework.ConfigurationTool.html

    #7354
    Grant [Tobii]
    Keymaster

    Hi @yannrl7, apologies for the delay in responding; I have been trying to gather the appropriate information for your project.

    Unfortunately, based on the information you have provided, it appears it will be necessary to use a Tobii Pro SDK licence, for the following reasons:

    You mentioned that you intend to store the gaze data for use in Matlab. The storage of gaze data is explicitly prohibited by the Tobii Core SDK licence agreement and requires a special licence.

    In addition, rather than calibrating on a monitor first, then changing screens and preventing recalibration (which is not actually something any SDK supports), it seems the best way to achieve what you need would be to use the Pro SDK, which supports alternative screen setups such as calibrating directly with a projector.

    I would suggest you contact the SDK licencing team for an appropriate quote: [email protected]

    That being said, perhaps the previous poster Jan Willem (@tttobitt), who was working on a similar project, may be able to offer you a workaround that avoids changing SDKs. Sorry this was not the answer you wanted, but hopefully we can find a solution that meets your needs and budget.
