Hey! The project I'm working on uses the controller to track a real-world object by mounting the controller on the object. The controller is always in direct view of the headset. After I added hand tracking to the project, I noticed that whenever my hands are not very close to the controller, the controller tracking freezes (I'm guessing this is done for power saving). In my case, though, I always want the controller to be tracked, since I'm using it to track a real-world object. Is there any way to force the controller to always be tracked?
Unity Editor version (if applicable): Unity 2022.3.30f1
ML2 OS version: 1.9.0
MLSDK version: 2.3.0
I also tried turning off hand tracking when I don't need it by simply calling Stop on the XRHandSubsystem instance, and it does stop tracking hands. The problem is that even then, if I move my hand away from the controller, the controller stops tracking.
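For reference, this is roughly how I stop it. A minimal sketch assuming the XR Hands package (UnityEngine.XR.Hands); HandTrackingToggle and the subsystem lookup are my own wrapper code, not anything from the SDK:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandTrackingToggle : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Awake()
    {
        // Find the running hand subsystem created by XR Plugin Management.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            m_Subsystem = subsystems[0];
    }

    // Call this from UI or game logic to toggle hand tracking at runtime.
    public void SetHandTracking(bool enable)
    {
        if (m_Subsystem == null)
            return;
        if (enable)
            m_Subsystem.Start();
        else
            m_Subsystem.Stop();
    }
}
```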
Unfortunately, the controller is not tracked after it has been put to sleep. However, if the unused controller remains stationary, you can save its last tracked pose. If tracking the controller while it's asleep is necessary for your use case, I will go ahead and create a voice of customer ingest for this issue.
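As a rough illustration of saving the last tracked pose, something like the sketch below could work. This is not official sample code; the action references are placeholders you would bind to the controller's position, rotation, and tracking-state input paths:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR;

// Sketch: cache the controller's last valid pose and keep applying it
// while tracking is unavailable, instead of letting the transform reset.
public class LastPoseCache : MonoBehaviour
{
    public InputActionProperty position;       // bind to controller position
    public InputActionProperty rotation;       // bind to controller rotation
    public InputActionProperty trackingState;  // bind to controller tracking state

    Vector3 m_LastPosition;
    Quaternion m_LastRotation = Quaternion.identity;

    void Update()
    {
        // The tracking state is an integer bitmask of InputTrackingState flags.
        var state = (InputTrackingState)trackingState.action.ReadValue<int>();

        // Only refresh the cached pose while the corresponding flag is valid.
        if ((state & InputTrackingState.Position) != 0)
            m_LastPosition = position.action.ReadValue<Vector3>();
        if ((state & InputTrackingState.Rotation) != 0)
            m_LastRotation = rotation.action.ReadValue<Quaternion>();

        // Apply the last known good pose even when tracking has dropped out.
        transform.SetPositionAndRotation(m_LastPosition, m_LastRotation);
    }
}
```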
hey @etucker !
When you mentioned "put to sleep," I'm not sure what you meant because the controller isn't stationary at all and isn't really turned off. My issue is that I need to use both hand tracking and controller tracking simultaneously, even when the hand isn't close to the controller.
Even if I’m okay with this behavior (losing controller tracking when hand is away) when hand tracking is active, the problem continues even when the hand tracking subsystem is stopped. This doesn't happen when I fully disable the Hand Tracking feature in XR Plugin Management—when it's completely disabled in the project settings, the controller tracks properly, even if my hands aren't on the controller.
Ideally, I'd like to force controller tracking to work all the time, even with the Hand Tracking feature enabled. At the very least, I'd expect this issue to stop when I turn off the hand tracking subsystem.
Unfortunately, simultaneous hand tracking and controller tracking is not part of our roadmap at the moment. However, I will create a voice of customer ingest to let the team know that hand tracking should not take over when the subsystem is stopped.
Okay... any idea how long it would take to address the voice of customer ingest? I'm pretty sure there will be a lot of applications that only need hand tracking in certain parts of the application, and for those, the controller going into standby when the hand is away, even when hand tracking is turned off, isn't great.
I'm sorry, "input binding path"? Can you elaborate?
To clear up any misunderstanding, let me explain my use case:
We've mounted the controller on a real gun, and use its position and orientation to track the gun in Unity. That part works perfectly.
To make things easier for the user, we wanted them to interact with the UI using hand tracking instead of the mounted controller, so we enabled hand tracking in XR Plugin Management. After enabling it, a strange behavior started: whenever the user moves their hand away from the controller, the controller tracking stops (I assume for performance reasons, but this doesn't help our case). This doesn't happen when the Hand Tracking feature inside XR Plugin Management is disabled; then the controller is always tracked, no matter where the user's hands are. To make controller tracking work again, the user has to move their hand back to it.
Now, I understand that hand tracking + controller tracking support is not on your roadmap, but even when we stop the hand tracking subsystem at runtime, the controller still stops tracking when hands move away. Ideally, turning the subsystem off should behave as if hand tracking had never been enabled in XR Plugin Management.
For the object in Unity that tracks the controller's position, what component are you using? Are you using a custom script that reads the controller position and rotation, or are you using the XR Controller component?
```csharp
public class ControllerScript : MonoBehaviour
{
    // Update is called once per frame
    void Update()
    {
        if (PlayerInputManager.Instance == null)
            return;

        // Drive this transform directly from the controller's pose actions.
        var controller = PlayerInputManager.Instance.PlayerInputs.Controller;
        transform.position = controller.Position.ReadValue<Vector3>();
        transform.rotation = controller.Rotation.ReadValue<Quaternion>();
    }
}
```
This is the code that sets the position and rotation of the controller. I disabled the XR Controller and TrackedPoseDriver components on the controller GameObject. Am I missing something?
Also, isn't this basically the same as enabling the "Ignore Tracking State" option on the TrackedPoseDriver? Checking it gave the same result. The only difference between having ignoreTrackingState checked or not is what happens when my hands move away: if it's checked, the controller goes back to the origin position and rotation; if not, the controller simply stays at the last known position and rotation.
You can view the issue in this video. It's a video with the custom controller script, but the behavior is basically the same without it as well: the controller goes back to the origin when the hand moves away. With ignoreTrackingState set to false in the previous method, the controller would simply stay at its last known position instead of going back to the origin when I move my hand away.
I believe the issue stems from our internal implementation in the input framework: when the Hand Interaction Profile (XR_EXT_hand_interaction OpenXR extension) is active, this is intended behavior. If neither hand is holding the controller, its tracking is no longer passed through to Unity.
You can work around this by removing the Hand Interaction Profile from the list of OpenXR interaction profiles (see below).
Without the Hand Interaction Profile, the controller should continue to be tracked even when hands aren't close to it, as long as it keeps moving (it would eventually go to sleep without activity). The remaining issue, though, is that without the Hand Interaction Profile you wouldn't get the hand aim pose, pinch, poke, etc. interaction actions when using the XRI rig and actions. In MRTK3, which I don't think you're using, this is mitigated with fallbacks that use hand keypoints (which you'd still get with the hand tracking subsystem active) to calculate the interaction poses and states. Something similar could be done here, but it isn't trivial if you aren't already using MRTK3.
It depends on which interactions you still need with hands.
Tbh though, it doesn't make sense for this behavior to happen when the hand tracking subsystem is off, right? Simply having the profile enabled shouldn't alter the controller tracking, no? If it's being done for efficiency reasons, why still do it when hand tracking is off?
We are using gesture recognition (pinch, for placing weapon attachments on the controller) and normal hand position tracking for UI traversal.
The Hand Tracking Subsystem (XR_EXT_hand_tracking OpenXR extension) gives you the hand joint data that can be used to drive hand meshes and the like. It doesn't deal with the hand gesture interactions like pinch and poke that the Hand Interaction Profile provides.
So even if the Hand Tracking Subsystem is inactive, an active Hand Interaction Profile will cause the underlying OpenXR implementation on ML2 not to pass along controller tracking unless the controller is held by either hand (from what I've been told by the internal team dealing with that).
Although not ideal in your use case, you can keep the Hand Tracking Subsystem active to track hand joints, and potentially use them to drive hand gesture actions (like can be done in MRTK3). That is at least an option I think, though perhaps not trivial.
It may even be possible to include MRTK3 in the project, but continue to use the XRI rig as you have it, and incorporate some of the hand gesture actions that can utilize hand joints as a workaround.
By the way, using the Controller as a mounted tracker like this is very interesting. How do you handle trigger press?
Okay, so if I disable the hand interaction poses feature alone in XR Plugin Management (while leaving the Hand Tracking Subsystem active), the controller will be tracked? If this is the case, I might be able to detect simple gestures like pinch on my own...
We have our own hardware that sends the trigger pull to Unity over Bluetooth, and all of that works really well; it's just the hand interaction that has been causing issues. It still works fairly well since hands are almost always around the mounted tracker, but it's not ideal, as people do take their hand away every now and then.
Okay, so if I disable the hand interaction poses feature alone in XR Plugin Management (while leaving the Hand Tracking Subsystem active), the controller will be tracked?
Correct, you should get hand joint tracking plus controller tracking, with the controller continuously tracked regardless of whether hands are close by (again, as long as the controller stays active and doesn't go to sleep due to inactivity).
The only trick then would be to use hand joint data to obtain the missing hand gesture interactions, which is definitely doable.
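For example, a pinch could be approximated from the joint data alone. The sketch below assumes the XR Hands package; the distance threshold is just a starting value to tune, and JointPinchDetector is not an SDK class:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch: derive a pinch gesture from hand joint data alone, so the
// Hand Interaction Profile can stay disabled while the Hand Tracking
// Subsystem keeps running.
public class JointPinchDetector : MonoBehaviour
{
    public XRHandSubsystem subsystem;     // assign the running hand subsystem
    const float k_PinchThreshold = 0.02f; // metres between fingertips; tune as needed

    public bool IsPinching(Handedness handedness)
    {
        if (subsystem == null)
            return false;

        XRHand hand = handedness == Handedness.Left
            ? subsystem.leftHand
            : subsystem.rightHand;
        if (!hand.isTracked)
            return false;

        // Treat a pinch as the thumb tip being close to the index tip.
        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumbPose) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose indexPose))
        {
            return Vector3.Distance(thumbPose.position, indexPose.position)
                   < k_PinchThreshold;
        }
        return false;
    }
}
```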