The Hand Controller and Eyes Controller for the XR Rig are provided, but nothing happens when they are activated and running.
What are the Hand Controller and Eyes Controller intended for?
These objects have not been fully implemented and are therefore disabled by default. In the future, these objects will use the data provided by the hand tracking and eye tracking APIs to drive the position and rotation of their transforms.
Hi!
I was trying to use the HandController, but it doesn't do anything; the ray line is visible in the app, but stays at the (0, 0, 0) coordinates. Is any update coming soon so the Hand and Eye controllers can be used in Unity?
If it's a feature that doesn't work and is yet to be released, why is it included in the first place? If it is added as a component, it will mistakenly be assumed to be functional.
Unity Editor version: 2022.2.18
ML2 OS version: 1.2.0
MLSDK version: 1.2.0
Unity SDK version: 1.6.0
Host OS: Windows
Hello. I am still having an issue with the Eyes Controller of the XR Rig. When I enable the GameObject, it doesn't do anything. I can see the ray, but the ray is not attached. I can read the eye position and rotation values from the Tracked Pose Driver, but the ray interactor of the XR Rig does not work. Is it still in development?
The Eye Tracking API needs to be initialized explicitly via a script in the scene. You can take a look at the Eye Tracking example scene in the Unity examples or read about the Eye Tracking API initialization in the guide here.
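For reference, here is a minimal initialization sketch. It assumes the Magic Leap Unity SDK 1.x API (`MLPermissions`, `MLPermission.EyeTracking`, and `InputSubsystem.Extensions.MLEyes`) and that the `EYE_TRACKING` permission is declared in the app's manifest; check the Eye Tracking example scene for the authoritative version:

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;

// Attach to any GameObject in the scene. Requests the EYE_TRACKING
// permission at runtime and then starts eye tracking, so the Eyes
// Controller's Tracked Pose Driver receives valid pose data.
public class EyeTrackingInitializer : MonoBehaviour
{
    private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

    private void Awake()
    {
        permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
        permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
    }

    private void Start()
    {
        // EYE_TRACKING is a "dangerous" permission, so it must be
        // requested at runtime in addition to the manifest entry.
        MLPermissions.RequestPermission(MLPermission.EyeTracking, permissionCallbacks);
    }

    private void OnPermissionGranted(string permission)
    {
        // Without this explicit call the Eyes Controller stays at the origin.
        InputSubsystem.Extensions.MLEyes.StartTracking();
    }

    private void OnPermissionDenied(string permission)
    {
        Debug.LogError($"Permission denied: {permission}. Eye tracking will not start.");
    }

    private void OnDestroy()
    {
        permissionCallbacks.OnPermissionGranted -= OnPermissionGranted;
        permissionCallbacks.OnPermissionDenied -= OnPermissionDenied;
    }
}
```

Once tracking has started, the Tracked Pose Driver on the Eyes Controller should begin reporting real gaze poses instead of sitting at (0, 0, 0).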