- 09/04/2020 at 11:08 #14109
Good day. I am a Linux-based developer working for the Institute of Psychology of the University of Bern, Switzerland.
I am interested in receiving the stream of eye vectors from a Vive Pro Eye under Linux. I cannot use the SRanipal SDK as it is Windows-only.
I downloaded the Tobii support package for Unity (TobiiXRSDK_22.214.171.124.unitypackage), which includes Linux shared library libtobii_stream_engine.so, whose content matches the API described here:
I thus would be able to link to it from C. The problem is that on this page:
in the Stream Engine paragraph, under “support for the HTC Vive Pro Eye”, I can read:
Contact us. Available for special use cases.
Since this appears to be the only viable route to obtaining Linux support for basic view ray streaming from the Vive Pro, I ask: what does ‘special use cases’ mean? Could I be one such case?
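For what it is worth, binding to the shared library from C without the SDK headers can at least be probed with `dlopen`/`dlsym`. A minimal sketch, assuming only that the library exports the `tobii_api_create` entry point named in Tobii’s public Stream Engine documentation (everything else here is illustrative and untested against real hardware):

```c
#include <dlfcn.h>
#include <stdio.h>

/* Try to load the Tobii Stream Engine shared library and resolve one of
 * its documented entry points. Returns the dlopen handle on success, or
 * NULL if the library (or the symbol) is not available on this system. */
void *probe_stream_engine(const char *path)
{
    void *handle = dlopen(path, RTLD_NOW | RTLD_LOCAL);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return NULL;
    }
    /* tobii_api_create is the documented Stream Engine call for creating
     * an API context; here we only check that the symbol resolves. */
    void *sym = dlsym(handle, "tobii_api_create");
    if (!sym) {
        fprintf(stderr, "symbol not found: %s\n", dlerror());
        dlclose(handle);
        return NULL;
    }
    return handle;
}
```

If the probe succeeds, the next steps per the Stream Engine documentation would be creating the API context, enumerating device URLs, and subscribing to a gaze stream, but none of that can be exercised without a supported device attached.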
Thanks in advance.

- 09/04/2020 at 17:52 #14116
Hi @flfl and thanks for your query. The Tobii XR SDK works on top of SRanipal, HTC’s own SDK. As you can read on HTC’s forum (https://forum.vive.com/topic/7012-vive-pro-eye-on-ubuntu-16-or-18/), they do not currently support Linux with the Vive Pro, nor plan to, so unfortunately, as long as the manufacturer does not expose its data on this platform, Tobii’s own SDK cannot access it.
Apologies for the disappointing news; perhaps you can find an emulation workaround such as Wine or VirtualBox to accommodate your needs.

- 14/04/2020 at 08:32 #14137
Thanks for your reply, Grant. Disappointing indeed. I have read HTC’s EULA for the SDK, and it is not signable (it forbids any attempt to use the SDK in ways other than intended), so I did not download the SDK. HTC/Valve are trying to lock people inside their ecosystem (which is mostly gaming-oriented). This is absolutely unsuitable for my work.
Do I understand correctly that your own library code converses with an HTC-developed interface that takes care to pass the conversation via USB to the hardware you license?
In this case, is this layer totally opaque to you (meaning that you are forbidden to explore alternative ways to speak with your hardware), or do you simply not write code for a more transparent, multi-architecture communications channel for lack of time or interest? In the second case, I might see whether the university could donate some of my time. In the end, it would be sufficient to code the right amount of encapsulation…
I have a further question: I see the previous model can still be found somewhere. For example:
Do data have to travel through HTC-provided libraries in that case too? Or could I have a linux-based solution for eyetracking out of the box with that device?
Thanks in advance.

- 14/04/2020 at 11:54 #14141
Hi @flfl, yes you are correct that our own library does converse with that of HTC to provide the XR SDK so the low-level layer is ‘opaque’ yes.
For the time being, we are focusing on the Windows platform for VR, which provides the great majority of the software currently available, so Linux support is unlikely to be prioritised in the near future.
Accordingly, it may be worth your time to petition HTC directly on this point.
With regard to the Tobii Pro HTC Vive VR, this was sold via the Tobii Pro Business Department, so please direct queries regarding this hardware to their dedicated team at https://www.tobiipro.com/contact/sales/

- 14/04/2020 at 12:58 #14143
Thanks for your reply. Going by first impressions, I prefer to steer clear of both HTC and Valve. They seem totally oblivious to the needs of the Linux world.
Would people from a *business department* be able to answer my technical question? Before even starting to consider searching for one of those older devices, I want to be sure about whether I would be able to bypass the whole HTC universe in order to receive the stream of view rays I need. This is the specific question I was posing above. I was originally directed to this forum by Ms. Leopoldine Brand, “Inside Sales Specialist – Tobii Pro”. Do you happen to know about a tech contact, instead of a business one?

- 14/04/2020 at 14:00 #14144
@flfl, once in touch with Tobii Pro, they should be able to redirect you to the appropriate team to deal with your query.
This forum is intended for development questions related to the Tobii Tech range of eye trackers (which includes the Vive Pro Eye but not the older, deprecated Pro HTC Vive) and their associated SDKs.
As you already have a contact within Tobii Pro Sales, kindly ask that this specific query regarding the Tobii Pro HTC Vive & Linux be forwarded to the Pro Support team. Best wishes.

- 14/04/2020 at 14:17 #14145
OK. I had supposed there was a single Tobii. It now looks like you and Tobii Pro are separate galaxies. Good to know. This conversation helped me establish for good that, at the present time, the eyetracker in the Vive Pro Eye is unusable from Linux. At least I know this for sure.
All the best for your future.
Carlo

- 06/05/2020 at 11:37 #14221
Hello Carlo, I was wondering if you managed to find a suitable solution for your project needs? Many thanks for any relevant updates.

- 06/05/2020 at 12:45 #14224
The only sad conclusions I could reach as a result of my foray into Tobii/Vive territory are these:
a) Tobii accepts that the only way for the world to access its hardware contained in the Vive Pro Eye is through a closed-source, Windows-only library (called something similar to ‘SRanipal’).
b) Vive (HTC/Valve) could not care less about Linux, and seems unmoved by the idea that their products could be used in research environments, or anywhere other than strictly bordered, Windows-only gaming circumstances. (I refused to download this ‘SRanipal’ library because its licence explicitly forbids any usage scenario different from the original one, i.e. leisure under Windows.)
c) The only viable path for me to access the view ray data (necessarily under Linux, directly managing the USB-level conversation with the device) seems to pass through reverse engineering, which is at present not viable.
So, for all practical purposes the project is stopped, and not for lack of good will on my side.
I have a question, if you wish and are able to answer. Where does the integration of the eye images take place in order to compute the viewing rays? Is it done in hardware aboard the device, or do you upload the images and perform the computation in your library on the host computer?
I ask this because the researchers I work for would be quite interested in also obtaining all or part of the raw images captured by the eyetracker (mainly in order to compute statistics on pupil-size changes in response to stimuli). If the image stream travels to the computer, it would be possible, at least in theory, to capture all or some of those images.
Thanks in advance for any answer that you will provide.

- 06/05/2020 at 17:36 #14227
Hi Carlo, sorry to hear about that. Indeed, eye tracking of any type on Linux still has a very small user base, and since even regular VR has limited Linux support, the idea of doing both on such a system seems like a non-starter for the time being, unfortunately.
The eye-tracking algorithms are performed onboard the device, so I am afraid that capturing the eye images would not be possible either, apologies.

- 08/05/2020 at 13:38 #14229
All you say must be true, but I know that, if the low-level library (or libraries) now used to communicate with the device were written cleanly, so that the actual messages exchanged, their meaning and their timing could be clearly deduced, and I were given the source code, I could most probably replicate the conversation using the Linux hidapi interface. After all, I can already get the gyroscope and lighthouse data that way.
Some years ago I had some (unsuccessful) experience trying to write eyetracker image-to-viewray code. *THAT* is difficult stuff. Replicating USB traffic is not.
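The replication approach sketched above has a natural starting point on Linux: the kernel already exposes HID devices as `/dev/hidraw*` nodes, so the first step is simply enumerating and reading those nodes. A minimal sketch using only POSIX calls; whether the Vive Pro Eye’s tracker even enumerates as a HID device is an open question, and the paths and buffer sizes here are purely illustrative:

```c
#include <dirent.h>
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

/* Count the /dev/hidraw* nodes currently present. Each one is a raw HID
 * channel to some device; the eye tracker, if it enumerates as HID at
 * all, would appear as one of these. Returns -1 if /dev is unreadable. */
int count_hidraw_nodes(void)
{
    DIR *dev = opendir("/dev");
    if (!dev)
        return -1;
    int count = 0;
    struct dirent *entry;
    while ((entry = readdir(dev)) != NULL) {
        if (strncmp(entry->d_name, "hidraw", 6) == 0)
            count++;
    }
    closedir(dev);
    return count;
}

/* Read one raw input report from a hidraw node (blocking). Returns the
 * number of bytes read, or -1 on error. Requires read permission on the
 * node, typically granted via a udev rule. */
ssize_t read_hid_report(const char *path, unsigned char *buf, size_t len)
{
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        return -1;
    ssize_t n = read(fd, buf, len);
    close(fd);
    return n;
}
```

From here, replicating a conversation would mean matching observed report IDs and payloads against captured USB traffic, which is exactly the part that requires either clean vendor source code or reverse engineering.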
Thanks for the chat.