Enable Gaze Aware NPCs in VR
- This topic has 4 replies, 2 voices, and was last updated 5 years, 1 month ago by Grant [Tobii].
30/10/2019 at 15:33 · #12209 · Rana (Participant)
Hi,
I wanted to ask whether the feature “Enable Gaze Aware NPCs” works in VR; I’ve been having some trouble enabling it. Also, can I use it with the Tobii XR SDK, or only with the Tobii for Desktop SDK? Thank you.
31/10/2019 at 09:26 · #12215 · Grant [Tobii] (Keymaster)
Hi @ranaelbastawisy, and thanks for your query. Indeed, you can implement Gaze Aware Non-Player Characters in your application using the Tobii XR SDK.
We discuss this feature @ https://vr.tobii.com/sdk/design/use-cases/social/basics/
You can download the XR SDK freely @ https://vr.tobii.com/sdk/downloads/ Please do check it out and let us know if you have any further questions.
31/10/2019 at 14:35 · #12219 · Rana (Participant)
Hi again,
Thank you for your prompt response.
In the link you’ve provided, I cannot find a guide on how to implement gaze awareness for NPCs, nor can I find the feature after downloading the SDK. I hope you can further help me with this matter. Thank you.
31/10/2019 at 18:46 · #12220 · Rana (Participant)
Also, the guide mentions “Unity Examples included in the Tobii XR SDK”, which are “Object Mapping”, “Social”, “User Interface” and “Hand-Eye Coordination”; see https://vr.tobii.com/sdk/develop/unity/unity-examples/
After importing the Tobii XR SDK into my project, those are the only examples I found.
If there is something I am missing to implement “Enable Gaze Aware NPCs” in VR, please let me know. Thank you.
01/11/2019 at 14:06 · #12241 · Grant [Tobii] (Keymaster)
Hi @ranaelbastawisy, unfortunately, we do not provide any samples that explicitly show how to implement gaze-aware NPCs at this time.
A non-player character responding to gaze signals is simply a design concept for increasing the player’s immersion in a game; you can control the NPC’s reaction according to dwell time, etc. The actual coding is a matter of mapping gaze to an object so that actions are triggered when the object receives the user’s gaze.
The syntax for this is covered at its most basic level in the Unity Sample documentation @ https://vr.tobii.com/sdk/develop/unity/getting-started/vive-pro-eye/#step-7-create-a-cube-and-place-it-somewhere-in-the-scene
with more sophisticated mapping in the “Object Mapping” section of the Unity Samples page. Apologies for any inconvenience; hopefully this is enough to get you started.
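To make the mapping idea above concrete, here is a minimal Unity C# sketch of a gaze-aware NPC. It assumes the Tobii XR SDK’s gaze-focus interface (`IGazeFocusable` in the `Tobii.G2OM` namespace, as used in the Unity samples); the component name `GazeAwareNpc`, the dwell threshold, and the reaction method are illustrative assumptions, not part of the SDK.

```csharp
using UnityEngine;
using Tobii.G2OM; // Tobii XR gaze-to-object mapping

// Attach to an NPC GameObject that has a collider, so the
// Tobii XR gaze mapper can report when the user looks at it.
public class GazeAwareNpc : MonoBehaviour, IGazeFocusable
{
    [SerializeField] private float dwellThreshold = 1.0f; // seconds of gaze before reacting (illustrative)

    private bool _hasFocus;
    private float _focusTimer;

    // Called by the SDK's gaze-to-object mapping when gaze focus changes.
    public void GazeFocusChanged(bool hasFocus)
    {
        _hasFocus = hasFocus;
        if (!hasFocus) _focusTimer = 0f; // reset dwell when gaze leaves
    }

    private void Update()
    {
        if (!_hasFocus) return;
        _focusTimer += Time.deltaTime;
        if (_focusTimer >= dwellThreshold)
        {
            ReactToGaze();
            _focusTimer = 0f;
        }
    }

    private void ReactToGaze()
    {
        // Placeholder reaction: turn toward the player, trigger an
        // animation, start a dialogue, etc.
        Debug.Log(name + " noticed the player looking at it.");
    }
}
```

The dwell-time check in `Update` is one simple way to avoid the NPC reacting to a brief, incidental glance; tune the threshold to taste.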