Get the Head Positions

    #10194
    Pedro Coelho
    Participant

    Hello, I have a Tobii 4C Eye tracker and I am using the Tobii Core SDK in a C# program.

    I am stuck on getting the coordinates of the head. I have already looked at the Core SDK example program and modified it a bit, but I don't know if it was the best solution.

    I want to get the coordinates of the head and return them from a function, but it gives me 0, and I am not sure what to use for the 2nd and 3rd arguments, which are of the Vector3 class. What I have now is:

    private readonly Host _host;
    private static HeadPoseStream _headPoseStream;
    private double headX;

    public GazeDataPoints()
    {
        _host = new Host();

        _headPoseStream = _host.Streams.CreateHeadPoseStream();
    }

    public List<Double> sendHeadPositions()
    {
        List<double> headPositions = new List<double>();

        _headPoseStream.HeadPose((headPosX, headPosY, headPosZ) =>
        {
            headX = headPosX;
        });

        headPositions.Add(headX);

        return headPositions;
    }

    If you could help me it would be great.
    Thank you

    #10388
    Grant [Tobii]
    Keymaster

    Hi @wrapcaesar and thanks for your query. Before proceeding, it is important to stress that using the Tobii 4C in conjunction with the Core SDK for the long-term storage of any gaze data for analysis purposes requires the purchase of a special analytical use licence.

    You can read more about this @ https://analyticaluse.tobii.com/

    If, however, your data is used purely for interaction purposes, then no further action is required. It would be useful to hear about your project intentions going forward.

    That being said, the issue in your code is that PosY & PosZ are being treated as double variables when they are in fact of type Vector3.

    Please find below a simple code sample to illustrate a more appropriate methodology.

    using System;
    using Tobii.Interaction;

    namespace ConsoleApp2
    {
        class Program
        {
            static void Main(string[] args)
            {
                var host = new Host();
                var headPoseStream = host.Streams.CreateHeadPoseStream();

                // Subscribe to the head pose stream and hand each sample to AddList.
                headPoseStream.HeadPose((headPosX, headPosY, headPosZ) => AddList(headPosX, headPosY, headPosZ));

                // Keep the console app alive so the stream callback has a chance to fire.
                Console.ReadKey();
            }

            static void AddList(double _headPosX, Vector3 _headPosY, Vector3 _headPosZ)
            {
                // add to list element
            }
        }
    }

    You should be able to modify this to suit your needs. Of course, you will need to convert the PosY and PosZ Vector3 values and combine them with PosX to build a list of doubles.
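
    For illustration only, a rough (untested) sketch of that conversion could look like the snippet below. It assumes the Vector3 type exposes X, Y and Z components and reuses the AddList name from the sample above; please check the Core SDK documentation for the exact meaning of each callback parameter.

    // Requires: using System.Collections.Generic;
    static readonly List<double> headPositions = new List<double>();

    // Hypothetical helper: flatten the reported values into plain doubles.
    static void AddList(double _headPosX, Vector3 _headPosY, Vector3 _headPosZ)
    {
        headPositions.Add(_headPosX);
        headPositions.Add(_headPosY.X);
        headPositions.Add(_headPosY.Y);
        headPositions.Add(_headPosY.Z);
        headPositions.Add(_headPosZ.X);
        headPositions.Add(_headPosZ.Y);
        headPositions.Add(_headPosZ.Z);
    }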

    Please let us know if we can be of any further assistance. Best Wishes.

    #10415
    Pedro Coelho
    Participant

    Hello Grant.
    Ok, thank you for your reply and your help. Yes, I will not use the program to store or transfer any data for research; it will only be used for interaction with the user.

    All right, I understand. I did what you wrote in the code and I can now get the coordinates of the head. What I want to do is rearrange the X and Y coordinates of the eyes, like a new calibration. For example, if the user's head is inclined more to the left or right, I want to set a new gaze trace for the eyes. How can I do that, and what functions can I use?

    I pushed my code to a GitHub repository if you want to take a look: https://github.com/pedrocoelho100/Gaze

    Thank you

    #10521
    Grant [Tobii]
    Keymaster

    Hi @wrapcaesar, OK, thanks for confirming that your application is for interactive use only.

    I am afraid, however, that it is not entirely clear what you are trying to achieve. Do you wish to somehow circumvent the need for a new calibration per user based on head position? Or should the gaze co-ordinates be modified by a specified margin according to head position?

    Certainly, every new user requires a new calibration and any attempt to use the same calibration (even with modification) will almost certainly result in very poor tracking quality if any tracking at all is possible.

    The Tobii hardware automatically compensates for head movement to determine gaze position on screen, so no action is required from the user end to implement this. What do you mean by ‘set a new gaze trace’?

    Thanks for helping to clarify the situation, hopefully we can help you accordingly.

    #10526
    Pedro Coelho
    Participant

    Hello Grant.
    Yes, what I want is for the gaze co-ordinates to be modified by a specified margin according to the head position. I know that the Tobii hardware automatically compensates for head movement, but sometimes there is a margin of error that could be corrected if, for example, the head movements involve larger angles or a greater distance from the Tobii Eye Tracker.

    What I was thinking was to get the head positions and, based on those co-ordinates, adjust the gaze margin.
    The other idea was to recalibrate the Tobii Eye Tracker every time the user moves his head to a different fixed position. Then some dots could appear to perform a new calibration.
    What do you think would be the best approach?

    Thank you

    #10542
    Grant [Tobii]
    Keymaster

    Hi @wrapcaesar, okay, thanks for the clarification. Certainly, you can in theory modify the gaze data by a pre-specified margin by assigning a new variable from the reported gaze locations and modifying that accordingly. There is no means to adjust the reported gaze data directly, but I don't think this should pose an issue for you?
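
    As a rough (untested) illustration of that idea, something along the lines of the snippet below could work. The offsetX and offsetY values here are hypothetical placeholders that you would compute yourself, for example from the head pose stream.

    using System;
    using Tobii.Interaction;

    namespace ConsoleApp2
    {
        class Program
        {
            // Hypothetical offsets, e.g. derived from your own head pose logic.
            static double offsetX = 0;
            static double offsetY = 0;

            static void Main(string[] args)
            {
                var host = new Host();
                var gazePointDataStream = host.Streams.CreateGazePointDataStream();

                gazePointDataStream.GazePoint((x, y, timestamp) =>
                {
                    // Copy the reported gaze location into new variables and
                    // apply your own margin; the reported data itself cannot be changed.
                    var adjustedX = x + offsetX;
                    var adjustedY = y + offsetY;
                    Console.WriteLine("Adjusted gaze: {0:0.0}, {1:0.0}", adjustedX, adjustedY);
                });

                // Keep the console app alive while data arrives.
                Console.ReadKey();
            }
        }
    }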

    Recalibrating each time a user moves the head to a different position is unlikely to yield any useful results, as the calibration is always optimised to work within the established tracking box at all extremities of head rotation and distance from the tracker.

    Overall, modifying the reported gaze location according to head rotation is not something we would officially recommend, as this is already calculated internally, but of course you are free to experiment and see if it bears fruit for your project.

    Were you experiencing poor tracking results and decided to take this approach as a result? Do you intend to run the application with multiple different users?
    Indeed, there are a number of things you can try to improve tracking accuracy, such as avoiding strong sources of IR light near the eye tracker (open windows, spotlights, etc.). You can also try 'improve calibration' via the Tobii System Icon tool.
