XRIT does not detect change from Hands to Controllers (ML2)

Our team would like to support as many headsets as possible with as consistent an interaction suite as is feasible. We're looking to use OpenXR and XRIT to achieve this, and want to replicate the behavior exhibited by Quest devices: putting the controllers down disables them and activates hands (if available), and picking them up re-enables them and disables hands. However, picking up the Magic Leap 2 controller while using the XRIT Hands demo rig leaves hands active. The controller still tracks and can interact with interactables, but its associated GameObject in the rig hierarchy is never enabled, so it is not rendered and cannot use the ray interactor.

I'd like to test some assumptions in the editor while previewing the project on the device, but so far the App Simulator only asks me to put the editor into play mode, and doing so neither renders to the headset nor reflects tracked devices in the editor. A cursory search of these forums suggests that App Simulator does not support OpenXR yet. If that's the case, are there any alternative ways to test the project on the headset through the editor in the meantime?

Any advice would be greatly appreciated.


Unity Version: 2022.3.21f1
Project Setup: Followed this guide. Also applied the hand interaction fix discussed here

Relevant Packages:

  • XR Interaction Subsystems: 2.0.0
  • XR Interaction Toolkit: 2.5.2
  • XR Hands: 1.4.0
  • OpenXR Plugin: 1.10.0
  • Input System: 1.7.0
  • XR Legacy Input Helpers: 2.1.10
  • XR Plugin Management: 4.4.0
  • AR Foundation: 5.1.2
  • Magic Leap SDK (com.magicleap.unitysdk): 2.1.0
  • Magic Leap Setup tool: 2.0.7

OpenXR Interaction Profiles:

  • Magic Leap 2 Controller Interaction Profile
  • Hand Interaction Profile

OpenXR Feature Groups:

  • Hand Tracking Subsystem
  • Hand Interaction Poses
  • Palm Pose
  • ALL Magic Leap 2 Features

Steps to reproduce:

  • Import the XRIT Hands Interaction Demo sample via the Package Manager
  • Add Assets/Samples/XR Interaction Toolkit/2.5.2/Hands Interaction Demo/HandsDemoScene.unity to build scenes as the only active scene
  • Build & Run to Magic Leap 2
  • Hand tracking works great (if you've applied the aforementioned changes to the InputActions asset provided by XRIT).
  • Pick up the controller; notice that hands remain active and rendered while the controller stays disabled. The controller still tracks and can interact with objects on the table, but the controller GameObject is not enabled and cannot use the ray interactor.

Thank you for all of the information. I looked into this and asked the team that maintains the MRTK 3 integration package for advice (MRTK 3 supports both controller and hand interactions). Here are some notes.

To support both the controller and hand-tracking input, you will need to create a script that mediates between the two types of controllers OR use the isTracked state for each of the inputs; a sketch of the mediator approach follows.
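For the mediator option, here is a minimal sketch. It assumes a rig where the controller visuals/interactors and the hand visuals/interactors live under two separate GameObjects; the class and field names are hypothetical, and the serialized action should be bound to one of the isTracked paths listed further below.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical mediator sketch (names are assumptions, not part of the XRIT rig):
// swaps the controller and hand GameObjects based on whether the physical
// controller is currently tracked.
public class ControllerHandMediator : MonoBehaviour
{
    [SerializeField] GameObject controllerGroup; // controller model + ray interactor
    [SerializeField] GameObject handGroup;       // hand visuals + interactors

    // Bind this in the inspector to a path such as
    // <MagicLeapController>{RightHand}/isTracked (see the paths below).
    [SerializeField] InputAction controllerIsTracked;

    void OnEnable()  => controllerIsTracked.Enable();
    void OnDisable() => controllerIsTracked.Disable();

    void Update()
    {
        bool tracked = controllerIsTracked.ReadValue<float>() > 0.5f;
        if (controllerGroup.activeSelf != tracked)
        {
            // Toggle the two groups so only one input modality is active at a time.
            controllerGroup.SetActive(tracked);
            handGroup.SetActive(!tracked);
        }
    }
}
```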

Regardless of which option you choose, you will need to adjust your input actions so that the tracking flags do not conflict between tracked devices. Instead of using the generic <XRController> bindings, use <MagicLeapController> and <HandInteraction>. This is necessary because both the controller and the hand input are otherwise mapped to Unity's generic XR Controller layout.

<MagicLeapController>{LeftHand}/isTracked
<MagicLeapController>{RightHand}/isTracked

<HandInteraction>{LeftHand}/isTracked
<HandInteraction>{RightHand}/isTracked
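To confirm the two flags now report independently, a throwaway diagnostic like the following could be dropped on any GameObject in the scene (a sketch, assuming the interaction profiles listed above are enabled):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Throwaway diagnostic: logs the right-hand tracking flags each frame so you
// can verify the controller and hand layouts report independently.
public class TrackingFlagLogger : MonoBehaviour
{
    InputAction controllerTracked;
    InputAction handTracked;

    void OnEnable()
    {
        controllerTracked = new InputAction(binding: "<MagicLeapController>{RightHand}/isTracked");
        handTracked = new InputAction(binding: "<HandInteraction>{RightHand}/isTracked");
        controllerTracked.Enable();
        handTracked.Enable();
    }

    void OnDisable()
    {
        controllerTracked.Disable();
        handTracked.Disable();
    }

    void Update()
    {
        Debug.Log($"controller tracked: {controllerTracked.ReadValue<float>() > 0.5f}, " +
                  $"hand tracked: {handTracked.ReadValue<float>() > 0.5f}");
    }
}
```

If picking up the controller flips the controller flag without the hand flag conflicting, the bindings are isolated and a mediator script can safely key off them.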


I have put in a request for a guide that better explains this. In the meantime, please reach out if you have any questions.