The Hand Controller and Eyes Controller are provided with the XR Rig, but nothing happens when they are enabled and running.
What are the Hand Controller and Eyes Controller intended for?
These objects have not yet been fully implemented and are therefore disabled by default. In the future they will use the data provided by the hand tracking and eye tracking APIs to drive the position and rotation of their transforms.
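For reference, a minimal sketch of what such a script might do once tracking data is available, using Unity's generic XR input API rather than the disabled components (the class name and the choice of tracked node are assumptions for illustration, not part of the SDK):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical stand-in for the disabled Hand Controller: reads the tracked
// hand device's pose from Unity's XR input API each frame and applies it to
// this object's transform.
public class ManualHandPoseDriver : MonoBehaviour
{
    // Which tracked node to follow; RightHand is an assumption for this sketch.
    public XRNode handNode = XRNode.RightHand;

    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(handNode);
        if (!device.isValid)
            return; // tracking not running yet, leave the transform alone

        // devicePosition/deviceRotation are standard XR input feature usages.
        if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position))
            transform.localPosition = position;

        if (device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
            transform.localRotation = rotation;
    }
}
```

Attaching a script like this to the controller object would keep it at the tracked pose each frame; until the tracking APIs report valid data, the transform simply stays where it is, which matches the behaviour described above.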
Hi!
I was trying to use the HandController, but it doesn't do anything: the ray line is visible in the app, but it stays at the 0, 0, 0 coordinates. Is an update coming soon so that the Hand and Eye controllers can be used in Unity?
If it's a feature that doesn't work and hasn't been released yet, why include it in the first place? If it's added as a component, it will be mistakenly assumed to be functional.