XRI Hand controller vs UI

Hey everyone
Been struggling with this for a while. The ML Controller works perfectly with its Ray Interactor and interacts with both 3D and UI elements.

However, the hand controls provided by the XRI Hand Controller do not interact with UI objects, though they do interact with other XR Interactables. Is there a fix available?

Hi @maks,

Thank you for reaching out regarding this issue. I have consulted the team and I will report back as soon as I learn more. Could you please provide a few extra details, though?

  • Unity Editor version
  • ML2 OS Version
  • ML SDK Version
  • Development OS (e.g. Mac, Windows)

Hi @etucker

2022.3.10f1
1.4.1
1.4.0
MacOS

@maks do you mind providing more information on how you are mapping the hand tracking input to the XRI controller?

@kbabilinski
I followed this tutorial

I saw you answer the same topic once with advice to use MRTK instead, but the XRI solution works fine except for this UI issue. Can we maybe find a solution that will make it work?

It's a little hard to tell. Does the UI panel react to the highlight input events from the ray when it hovers over the buttons?

If it does, you may need to edit the XRI Hand Controller script to send the additional UI input events by editing the following section of the code:

    bool isPinched = GetPinch();

    controllerState.selectInteractionState.active = isPinched;
    controllerState.selectInteractionState.activatedThisFrame = isPinched && !pinchedLastFrame;
    controllerState.selectInteractionState.deactivatedThisFrame = !isPinched && pinchedLastFrame;

Alternatively, you can use the latest version of the Magic Leap Unity SDK 2.0 and OpenXR to take advantage of the XR Hands and XRI samples: Hands Interaction Demo | XR Interaction Toolkit | 2.5.2

Note: to make the sample work properly on Magic Leap 2, you will need to make sure the Unity Input Action asset is updated with the correct bindings. By default the hand input maps to the MetaHand input profile and has to be changed to the HandInteraction paths.
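For illustration, the rebinding might look something like this in the XRI Default Input Actions asset. These control paths are assumptions based on the XR Hands package's Meta aim layout and Unity's OpenXR Hand Interaction Profile; the exact names depend on your package versions, so verify them in the binding path picker:

```
// Select Value binding, left hand:
// Before (Meta Hand aim layout from the XR Hands package):
//   <MetaAimHand>{LeftHand}/pinchStrengthIndex
// After (OpenXR Hand Interaction Profile layout):
//   <HandInteraction>{LeftHand}/pinchValue
```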

Hey @kbabilinski

No, it does not. But the ray is reacting: it "sees" that it's hovering over something it can interact with.

The section of the code you mentioned looks identical to what you suggested:

    bool isPinched = GetPinch();

    controllerState.selectInteractionState.active = isPinched;
    controllerState.selectInteractionState.activatedThisFrame = isPinched && !pinchedLastFrame;
    controllerState.selectInteractionState.deactivatedThisFrame = !isPinched && pinchedLastFrame;
    pinchedLastFrame = isPinched;

So what should I add?

We will look into this, but you may get an answer faster by asking this question on the Unity Forum or Unity Community, since they support XRI directly.

In the meantime, I recommend trying our MRTK3, which supports hand interaction across 2D and 3D objects, including poke interaction.

Actually, it started to work after I added this line:

    controllerState.uiPressInteractionState.active = isPinched;
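For anyone who finds this later, here is a minimal sketch of what that update section could look like with the UI press state driven alongside the select state. This assumes the same hypothetical `GetPinch()` helper and `pinchedLastFrame` field from the snippets above, and it mirrors the `ThisFrame` flags onto `uiPressInteractionState` as well; the thread's fix only needed the `.active` line, so treat the rest as an optional extension, not the confirmed solution:

```csharp
// Inside the hand controller's state-update method (the section quoted above):
bool isPinched = GetPinch();

// Drive selection of 3D XR Interactables (grab, activate, etc.)
controllerState.selectInteractionState.active = isPinched;
controllerState.selectInteractionState.activatedThisFrame = isPinched && !pinchedLastFrame;
controllerState.selectInteractionState.deactivatedThisFrame = !isPinched && pinchedLastFrame;

// Also drive UI presses so the ray can click uGUI buttons and sliders.
// uiPressInteractionState is the same InteractionState type as selectInteractionState.
controllerState.uiPressInteractionState.active = isPinched;
controllerState.uiPressInteractionState.activatedThisFrame = isPinched && !pinchedLastFrame;
controllerState.uiPressInteractionState.deactivatedThisFrame = !isPinched && pinchedLastFrame;

pinchedLastFrame = isPinched;
```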

This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.