Vive Pro Eye with Stream Engine

    #14109
    Carlo Prelz
    Participant

    Good day. I am a Linux-based developer working for the Institute of Psychology at the University of Bern, Switzerland.

    I am interested in receiving the stream of eye vectors from a Vive Pro Eye under Linux. I cannot use the SRanipal SDK, as it is Windows-only.

    I downloaded the Tobii support package for Unity (TobiiXRSDK_1.8.0.168.unitypackage), which includes the Linux shared library libtobii_stream_engine.so, whose contents match the API described here:

    https://vr.tobii.com/sdk/develop/native/stream-engine/api/

    I would thus be able to link to it from C. The problem is that, on this page:

    https://vr.tobii.com/sdk/develop/native/

    in the Stream Engine paragraph, under "support for the HTC Vive Pro Eye", I read:

    "Contact us. Available for special use cases."

    Since this appears to be the only viable route to obtaining Linux support for basic view ray streaming from the Vive Pro, I ask: what does 'special use cases' mean? Could mine be one of those cases?
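
    For illustration, this is roughly the call sequence I would expect to use from C, going by the documented Stream Engine API. It is a sketch only: I have not been able to test it, the gaze-point stream shown here is the desktop-style one (the wearable streams for headsets may differ), and exact signatures vary between Stream Engine versions.

    /* Sketch only, untested. Build along the lines of:
     *   gcc gaze_demo.c -I<stream-engine headers> -L. -ltobii_stream_engine -o gaze_demo
     * Newer Stream Engine versions add a field-of-use argument to
     * tobii_device_create(); adjust to the headers actually shipped. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <tobii/tobii.h>
    #include <tobii/tobii_streams.h>

    static void url_receiver( char const* url, void* user_data )
    {
        /* Keep the first device URL reported by the enumeration. */
        char* buffer = (char*) user_data;
        if( buffer[ 0 ] == '\0' )
            strncpy( buffer, url, 255 );
    }

    static void gaze_callback( tobii_gaze_point_t const* gaze, void* user_data )
    {
        (void) user_data;
        if( gaze->validity == TOBII_VALIDITY_VALID )
            printf( "%lld: %f %f\n", (long long) gaze->timestamp_us,
                    gaze->position_xy[ 0 ], gaze->position_xy[ 1 ] );
    }

    int main( void )
    {
        tobii_api_t* api = NULL;
        if( tobii_api_create( &api, NULL, NULL ) != TOBII_ERROR_NO_ERROR )
            return 1;

        char url[ 256 ] = { 0 };
        if( tobii_enumerate_local_device_urls( api, url_receiver, url ) != TOBII_ERROR_NO_ERROR
            || url[ 0 ] == '\0' )
            return 1;

        tobii_device_t* device = NULL;
        if( tobii_device_create( api, url, &device ) != TOBII_ERROR_NO_ERROR )
            return 1;

        if( tobii_gaze_point_subscribe( device, gaze_callback, NULL ) != TOBII_ERROR_NO_ERROR )
            return 1;

        for( int i = 0; i < 1000; ++i )
        {
            /* A real program would block in tobii_wait_for_callbacks()
             * (whose signature also varies between versions); a plain
             * sleep keeps this sketch version-independent. */
            tobii_device_process_callbacks( device );
            usleep( 10000 );
        }

        tobii_gaze_point_unsubscribe( device );
        tobii_device_destroy( device );
        tobii_api_destroy( api );
        return 0;
    }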

    Thanks in advance

    #14116
    Grant [Tobii]
    Keymaster

    Hi @flfl and thanks for your query. The Tobii XR SDK works on top of HTC's own SRanipal SDK. As you can read on HTC's forum (https://forum.vive.com/topic/7012-vive-pro-eye-on-ubuntu-16-or-18/), they do not currently support Linux with the Vive Pro, nor do they plan to, so unfortunately, as long as the manufacturer does not expose its data on that platform, Tobii's own SDK cannot access it.

    Apologies for the disappointing news; perhaps you can find an emulation workaround such as Wine or VirtualBox to accommodate your needs.

    #14137
    Carlo Prelz
    Participant

    Thanks for your reply, Grant. Disappointing indeed. I have read HTC's EULA for the SDK, and I cannot agree to it (it forbids any attempt to use the SDK in ways other than intended), so I did not download the SDK. HTC/Valve are trying to lock people into their ecosystem (which is mostly gaming-oriented). This is absolutely unsuitable for my work.

    Do I understand correctly that your own library talks to an HTC-developed interface, which in turn carries the conversation over USB to the hardware you license?

    In that case, is this layer totally opaque to you (meaning that you are forbidden to explore alternative ways of speaking with your hardware), or do you simply not write code for a more transparent, multi-architecture communication channel for lack of time or interest? In the second case, I could try to see whether the university might donate some of my time. In the end, it would only be a matter of writing the right amount of encapsulation...

    I have a further question: I see the previous model can still be found somewhere. For example:

    https://imotions.com/hardware/tobii-htc-vive-vr-headset/

    Do the data have to travel through HTC-provided libraries in that case too? Or could I have a Linux-based, out-of-the-box solution for eye tracking with that device?

    Thanks in advance

    #14141
    Grant [Tobii]
    Keymaster

    Hi @flfl, yes, you are correct: our own library converses with HTC's to provide the XR SDK, so the low-level layer is indeed 'opaque'.

    For the time being, we are focusing on the Windows platform for VR, which accounts for the vast majority of the software currently available, so Linux support is unlikely to be prioritised in the near future.

    Accordingly, it may be worth your time to petition HTC directly on this point.

    With regard to the Tobii Pro HTC Vive VR, this was sold via the Tobii Pro business department, so please direct queries regarding this hardware to their dedicated team at https://www.tobiipro.com/contact/sales/

    #14143
    Carlo Prelz
    Participant

    Thanks for your reply. On first impression, I prefer to steer clear of both HTC and Valve. They seem totally oblivious to the needs of the Linux world.

    Would people from a *business department* be able to answer my technical question? Before I even start considering a search for one of those older devices, I want to be sure whether I could bypass the whole HTC universe in order to receive the stream of view rays I need. This is the specific question I was posing above. I was originally directed to this forum by Ms. Leopoldine Brand, "Inside Sales Specialist, Tobii Pro". Do you happen to know of a technical contact, rather than a business one?

    #14144
    Grant [Tobii]
    Keymaster

    @flfl, once in touch with Tobii Pro, they should be able to redirect you to the appropriate team to deal with your query.

    This forum is intended for development questions related to the Tobii Tech range of eye trackers (which includes the Vive Pro Eye but not the older, deprecated Tobii Pro HTC Vive) and their associated SDKs.

    As you already have a contact within Tobii Pro Sales, kindly ask that this specific query regarding the Tobii Pro HTC Vive and Linux be forwarded to the Pro Support team. Best wishes.

    #14145
    Carlo Prelz
    Participant

    Okay. I had assumed there was just one Tobii; it now looks like you and Tobii Pro are separate galaxies. Good to know. This conversation has helped me establish for good that, at the present time, the eye tracker in the Vive Pro Eye is unusable from Linux. At least I know this for sure.

    All the best for your future.

    Carlo

    #14221
    Grant [Tobii]
    Keymaster

    Hello Carlo, I was wondering whether you managed to find a suitable solution for your project's needs. Many thanks for any relevant updates.

    #14224
    Carlo Prelz
    Participant

    Hi Grant.

    The only (sad) conclusions I could reach from my foray into Tobii/Vive territory are these:

    a) Tobii accepts that the only way for the world to access its hardware inside the Vive Pro Eye is through a closed-source, Windows-only library (SRanipal).

    b) Vive (HTC/Valve) could not care less about Linux, and seems unmoved by the idea that their products could be used in research environments, or anywhere else than in strictly bounded, Windows-only gaming circumstances. (I refused to download the SRanipal library because its licence explicitly forbids any usage scenario other than the intended one, i.e. leisure under Windows.)

    c) The only remaining path for me to access the view ray data (necessarily under Linux, managing the USB-level conversation with the device directly) seems to be reverse engineering, which is not practicable at present.

    So the project is, for now, at a standstill, and not for lack of good will on my side.

    I have a question, if you wish and are able to answer. Where does the processing of the eye images into viewing rays take place? Is it done in hardware aboard the device, or do the images travel to the host computer, where the computation is performed in your library?

    I ask this because the researchers I work for would also be quite interested in obtaining all or part of the raw images captured by the eye tracker (mainly in order to compute statistics on pupil-size changes in response to stimuli). If the image stream travels to the computer, it would be possible, at least in theory, to capture all or some of those images.

    Thanks in advance for any answer that you will provide.

    #14227
    Grant [Tobii]
    Keymaster

    Hi Carlo, sorry to hear that. Indeed, eye tracking of any type on Linux still has a very small user base, and since even regular VR has limited Linux support, the idea of doing both on such a system seems like a non-starter for the time being, unfortunately.

    The eye-tracking algorithms are performed onboard the device so I am afraid that capturing the eye images would not be possible either, apologies.

    #14229
    Carlo Prelz
    Participant

    All you say may well be true, but I know that if the low-level library (or libraries) now used to communicate with the device were written cleanly, so that the actual messages exchanged, their meaning and their timing could be clearly deduced, and if I were given the source code, I could most probably replicate the conversation using the Linux hidapi interface. After all, I can already get the gyroscope and lighthouse data that way.

    Some years ago I had some (unsuccessful) experience trying to write eye-tracker image-to-view-ray code. *THAT* is difficult stuff. Replicating USB traffic is not.
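
    To give an idea of what I mean, this is the kind of hidapi enumeration/read loop I would start from. The vendor/product IDs below are only placeholders to be confirmed with lsusb, and without documentation of the report format the bytes read here would of course be uninterpretable.

    /* Sketch of a raw HID enumeration/read loop via hidapi (hidraw backend).
     * Build: gcc hid_dump.c -lhidapi-hidraw -o hid_dump
     * VID/PID are placeholders for whichever HID interface the eye
     * tracker turns out to expose. */
    #include <stdio.h>
    #include <hidapi/hidapi.h>

    #define VID 0x0bb4  /* placeholder (HTC); confirm with lsusb */
    #define PID 0x0000  /* placeholder */

    int main( void )
    {
        if( hid_init() != 0 )
            return 1;

        /* First list everything, to locate the interface of interest. */
        struct hid_device_info* devs = hid_enumerate( 0, 0 );
        for( struct hid_device_info* d = devs; d; d = d->next )
            printf( "%04hx:%04hx %ls / %ls path=%s\n",
                    d->vendor_id, d->product_id,
                    d->manufacturer_string ? d->manufacturer_string : L"?",
                    d->product_string ? d->product_string : L"?",
                    d->path );
        hid_free_enumeration( devs );

        hid_device* dev = hid_open( VID, PID, NULL );
        if( !dev )
        {
            fprintf( stderr, "could not open device\n" );
            hid_exit();
            return 1;
        }

        /* Dump raw reports; interpreting them is the hard, undocumented part. */
        unsigned char buf[ 64 ];
        for( int i = 0; i < 100; ++i )
        {
            int n = hid_read_timeout( dev, buf, sizeof( buf ), 1000 );
            for( int j = 0; j < n; ++j )
                printf( "%02x ", buf[ j ] );
            if( n > 0 )
                printf( "\n" );
        }

        hid_close( dev );
        hid_exit();
        return 0;
    }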

    Thanks for the chat.
