Using EyeX with OSX via a Windows VM or separate Windows machine


Viewing 10 posts - 1 through 10 (of 10 total)

    Disappointingly, EyeX only works on Windows.

    Could there be some way to get it working on OSX (and other platforms) by running a separate Windows machine and sending signals over UDP, say?

    If I understand right, EyeX is quite demanding on the hardware’s CPU, so having a dedicated machine to handle the eye tracking would have the benefit of freeing up my development machine.

    So it might be possible to find some lightweight/miniature piece of hardware capable of running Windows; maybe I could bolt it onto the back of my LCD.

    Maybe something like http://www.intel.co.uk/content/www/uk/en/nuc/overview.html

    Although maybe I can cut corners further: what’s the minimum spec I need to support EyeX?

    Can anyone sketch out the task in a little more detail?



    Annoyingly I can no longer edit my post, so I have to create a new one. Messy.

    More annoyingly, the forum logged me out without telling me and ate my reply when I submitted it. So I’m having to rewrite it.

    I found some potential candidates here http://www.pcworld.com/article/2911098/computers/mini-pc-invasion-10-radically-tiny-computers-that-fit-in-the-palm-of-your-hand.html#slide9 and here http://www.pcworld.com/article/2911098/computers/mini-pc-invasion-10-radically-tiny-computers-that-fit-in-the-palm-of-your-hand.html#slide12

    Can someone with knowledge of the resource consumption of EyeX recommend a suitable miniature PC? The main problem will be power, I’m guessing. With the EyeX connected to the miniature PC’s USB, the miniature PC probably can’t itself draw power from a USB port, as the combined consumption will likely exceed the USB 3 current-limit specification.

    Tobii people, if you wish to consider yourselves the world leaders in this technology, you must support OSX and Linux.

    But rather than working on a complete software stack, which would take a team months, why not build a Windows proxy system over UDP?

    That should be very lightweight coding: it would just sit on top of your existing API, communicating bidirectionally with any clients that have attached.

    Even calibration could be performed through this protocol.
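
    To make the proxy idea concrete, here is roughly the kind of wire format I have in mind; the field layout and names are entirely my own guess, not anything Tobii provides:

```python
import struct

# Hypothetical wire format for one gaze sample sent from the Windows proxy:
# little-endian double x, double y (screen pixels) and a signed 64-bit
# timestamp in microseconds. 24 bytes fits comfortably in one UDP datagram.
GAZE_FORMAT = "<ddq"
GAZE_SIZE = struct.calcsize(GAZE_FORMAT)

def encode_gaze(x, y, timestamp_us):
    """Pack one gaze sample on the Windows side before sendto()."""
    return struct.pack(GAZE_FORMAT, x, y, timestamp_us)

def decode_gaze(payload):
    """Unpack one gaze sample on the OSX/Linux client after recvfrom()."""
    return struct.unpack(GAZE_FORMAT, payload)
```

    The sender would just call `socket.sendto(encode_gaze(...), addr)` for every sample the EyeX API delivers, and calibration commands could travel the other way over the same socket.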


    PS copying to clipboard before I send, please fix your forum!

    Mark Castle

    I use the EyeX on an older MacBook Pro using VMware Fusion. I created a virtual serial port in OS X and then attached it to a Windows 10 guest. I then wrote a few lines of code using the SDK that print the eye coordinates to the virtual serial port. An automation script takes care of running the program each time the guest starts up, and of cleaning up afterwards.

    It was a pain to get worked out at first, but so far it’s been working great. There isn’t any noticeable delay, and I don’t have any performance issues. I use Python to monitor the serial connection and act accordingly, though any language with a serial library should work.

    If this sounds like something that might work for you, let me know and I’ll try to provide you with some more information. The source code isn’t on GitHub yet, as it’s sort of a mess and just something I used for a side project a while back.
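
    In rough outline, the host side looks something like this (a sketch rather than my exact code; it assumes pyserial, and the device path is a placeholder that depends on how the virtual serial port was set up):

```python
def parse_gaze_line(line):
    """Parse one 'x,y' line as written by the program in the Windows guest."""
    x_str, y_str = line.strip().split(",")
    return float(x_str), float(y_str)

def monitor(port="/dev/tty.virtualserial", baud=115200):
    """Read gaze coordinates from the virtual serial port and act on them."""
    import serial  # pyserial: pip install pyserial
    with serial.Serial(port, baud, timeout=1) as conn:
        while True:
            raw = conn.readline().decode("ascii", errors="ignore")
            if raw.strip():
                x, y = parse_gaze_line(raw)
                print(f"gaze at ({x:.1f}, {y:.1f})")
```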


    @Mark Castle, that is inspiring to hear!

    I’m worried about overburdening my MBA; I am running out of SSD space and sometimes the fan goes crazy. How much load does the virtual Windows + EyeX server put on your MacBook Pro? Is there any simple way of quantifying that? Maybe just RAM and %CPU…

    As regards the other solution path, I’m struggling to find a really small form factor Windows machine that supports USB 3. There are a bunch of products with roughly the same form factor as an old Mac mini, which I will call miniPCs, and a bunch of much smaller ones that look like a USB dongle, which I will call microPCs.

    MicroPCs tend to only support USB 2:


    Although for that last one, there is a version 2 coming out soon which will expose a USB 3 port:


    The most promising looking contender is much bigger than all of these:


    I’m hoping to find something really small, so that I can glue it to my LCD.



    @Mark Castle, I’m going to try your approach.

    If you can share any of the Python code, I am most grateful!

    I will try using VirtualBox, as VMware Fusion costs money, so I will have to look up how to work the virtual serial ports.


    Mark Castle

    I just reviewed the source code I used, and it looks like I actually just wrote the gaze coordinates to a text file shared between the guest and host. I think I did this because it was more reliable than using a serial port. I’ll push it to GitHub later today.
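
    In case it helps before the push, the host side boils down to something like this (a sketch from memory, not the real code; the file format is assumed to be one `x,y` pair per line, latest sample last):

```python
def read_latest_gaze(path):
    """Return (x, y) from the last non-empty line of the shared file,
    or None if the file is missing or empty."""
    try:
        with open(path) as f:
            lines = [ln.strip() for ln in f if ln.strip()]
    except OSError:
        return None
    if not lines:
        return None
    x_str, y_str = lines[-1].split(",")
    return float(x_str), float(y_str)
```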


    Mark, did you encounter USB connectivity issues? I’ve been trying with VirtualBox and it disconnects frequently.

    I’m going to try again later today with VMware.

    What year is your MacBook Pro?


    PS If I ever do get it working nicely on a VM, I’m going to look at a WebSockets solution for sending the data back to the host, because that solution would also extend to using a separate Windows machine to do the tracking. Let me know when you get your project up on GitHub, as I think I could reuse much of the code.
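
    I haven’t picked a WebSocket library yet, but the message itself could be as simple as a small JSON payload; the field names below are just my placeholders:

```python
import json

def gaze_message(x, y, timestamp_us):
    """Serialise one gaze sample as a WebSocket text-frame payload."""
    return json.dumps({"x": x, "y": y, "ts": timestamp_us})

def parse_gaze_message(payload):
    """Decode a gaze sample on the receiving side."""
    d = json.loads(payload)
    return d["x"], d["y"], d["ts"]
```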

    Mark Castle

    Yes, I did at first. I fixed it by allocating more resources to the guest and enabling USB 3.0 support.

    [Here’s a link to the program that runs in the guest and writes the gaze coordinates to a text file.](https://github.com/merktassel/research-project-2015/blob/master/Visual%20Studio/Tracker/Program.cs) I essentially just took the sample program that printed eye coordinates to the terminal and had it send the data to a text file instead.

    [Here’s the Python program that monitors the text file to obtain the gaze coordinates.](https://github.com/merktassel/research-project-2015/tree/master/Python) There’s also a Ruby version that I haven’t committed yet. It’s pretty straightforward, so you should be able to use any language so long as it can read from a text file fairly quickly.
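
    If you just want the gist without cloning the repo, the polling side amounts to something like this (a sketch, not the committed code; it re-reads the file whenever its modification time changes):

```python
import os
import time

def poll_once(path, last_mtime):
    """Check the shared file once; return (new_mtime, sample or None)."""
    try:
        mtime = os.path.getmtime(path)
    except OSError:
        return last_mtime, None
    if mtime == last_mtime:
        return last_mtime, None
    with open(path) as f:
        lines = [ln.strip() for ln in f if ln.strip()]
    if not lines:
        return mtime, None
    x_str, y_str = lines[-1].split(",")
    return mtime, (float(x_str), float(y_str))

def follow(path, on_gaze, poll_seconds=0.01):
    """Loop forever, invoking on_gaze(x, y) for each new sample."""
    last_mtime = 0.0
    while True:
        last_mtime, sample = poll_once(path, last_mtime)
        if sample is not None:
            on_gaze(*sample)
        time.sleep(poll_seconds)
```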

    ![Screenshot of Host Specifications](https://www.dropbox.com/s/p4fcd48seowlbbn/Screenshot%202015-12-14%2009.56.21.png?dl=0)
    ![Screenshot of Guest Specifications](https://www.dropbox.com/s/p4fcd48seowlbbn/Screenshot%202015-12-14%2010.09.20.png?dl=0)


    Still no joy. My guess is that MBP2012 is just above the threshold for handling the USB3 datastream fast enough, and MBA2012 is just below.

    Interestingly I got further with VirtualBox than I did with VMWare.

    I have now used BootCamp to boot straight into Windows on my Mac, and it works perfectly.


    PS @MarkCastle, thanks for the code!

    Andrew Long

    A cursory glance at the C/C++ SDK and the data stream information available on this site suggests there is enough information for a motivated (and talented low-level) individual to write their own “driver” for Mac or Linux, or any other operating system.

    I might be wrong about this. Have any Mac or Linux heads had a look at the C/C++ SDK and data stream and thought about what’s required to get basic eye tracking working?

    It seems the data stream can give information in millimetres and pixels, so as long as the connected device can work at USB 3 rates, it should be possible to process this stream and do anything desired with that information. This is beyond my C/C++ skills and knowledge of Mac OSX, and I don’t use Linux (shame on me). Anyone know a guy with these abilities?
