UI touch input with hand tracking and XRITK

Is there any way to do some sort of poke/touch input for UI buttons via hand tracking and the XR Interaction Toolkit?
Basically being able to press a button with the tip of the finger.

Our current solution is simply to put a collider on the button and handle OnCollisionEnter, but this is not great: there is no real direct-touch feel to it. I was hoping there is already a poke interaction system that works with the XRITK (similar to how it works in the demo scenes on Oculus Quest)?
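For context, our workaround is roughly the following (simplified sketch; "Fingertip" is just a tag we put on a small trigger sphere with a kinematic Rigidbody, parented to the index tip):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Simplified sketch of our current workaround: a trigger collider on the
// button fires the Button's onClick when the fingertip sphere enters it.
[RequireComponent(typeof(Button), typeof(Collider))]
public class ColliderButton : MonoBehaviour
{
    private Button _button;

    private void Awake()
    {
        _button = GetComponent<Button>();
    }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Fingertip"))
            _button.onClick.Invoke();
    }
}
```

This works, but it fires on any overlap, with no press depth, hover, or select feedback.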

I am aware of MRTK but in our project we do not want to use it.

Hi @jc1,

Here is a demo scene from the Unity documentation website.
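At its core, that sample drives UI with XRI's XRPokeInteractor. If you want to wire it up by hand, the setup amounts to roughly this (a sketch, assuming XRI 2.4+ where the poke interactor gained its Enable UI Interaction option; the object names are placeholders):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Sketch: set up poke-based UI input at runtime.
// "fingertipPose" is a transform you keep aligned with the index fingertip.
public class PokeSetup : MonoBehaviour
{
    [SerializeField] private Transform fingertipPose;
    [SerializeField] private Canvas worldSpaceCanvas;

    private void Start()
    {
        // The poke interactor lives on (or follows) the fingertip.
        var poke = fingertipPose.gameObject.AddComponent<XRPokeInteractor>();
        poke.enableUIInteraction = true; // lets the poke press UGUI elements

        // World-space canvases need this raycaster to receive XRI UI events.
        if (worldSpaceCanvas.GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            worldSpaceCanvas.gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();
    }
}
```

The scene also needs an EventSystem with an XRUIInputModule, which the sample already includes.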

I hope this helps and don't hesitate to reach out if you have any further questions.

Best,

El

Hi,

I did already have a look at that scene. The demo works fine with a Quest device, but I have been unable to get it to work with ML hand tracking and am kind of lost on how I would set it up with hand objects. Some help would be greatly appreciated. Thanks

Are you receiving any errors when trying to run it?

Also could you share a few more details:

  • Unity Editor version
  • ML2 OS Version
  • ML SDK Version
  • Development OS (e.g. Mac, Windows)

@etucker,

To get the sample scene working I have to import the XR Hands package (com.unity.xr.hands, v1.2.1) as well as the HandVisualizer sample, but after I install the package I get all of these errors within the ML SDK (see attached screenshots). If I just play the scene without XR Hands, nothing is rendered in the scene, as it does require the package.

  • Unity Editor: 2022.3.5f1 LTS (URP)
  • ML2 OS: 1.4.1
  • ML SDK: 1.4.0 (package 1.12.0)
  • Development OS: Windows 10 Home
  • AR Foundation: 5.0.7

I tried all of the following XRITK versions: 2.4.3, 2.3.2, and 2.5.2.

Hi @jc1,

Did you make sure that you set the correct permissions to use Hand Tracking?

The Hand Tracking API requires the HAND_TRACKING permission to be declared in the application's manifest. To do this, go to Edit > Project Settings > Magic Leap > Manifest Settings and enable HAND_TRACKING. For more information, refer to the permissions guide.
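If you want to be safe, you can also verify the permission at runtime before starting hand tracking. Here is a minimal sketch against the ML Unity SDK 1.x MLPermissions API (hand tracking is a normal permission, so once declared it should be granted automatically):

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;

// Sketch: confirm the HAND_TRACKING permission, then start hand tracking.
public class HandTrackingPermission : MonoBehaviour
{
    private readonly MLPermissions.Callbacks _callbacks = new MLPermissions.Callbacks();

    private void Start()
    {
        _callbacks.OnPermissionGranted += OnGranted;
        _callbacks.OnPermissionDenied += OnDenied;
        MLPermissions.RequestPermission(MLPermission.HandTracking, _callbacks);
    }

    private void OnGranted(string permission)
    {
        Debug.Log($"{permission} granted, starting hand tracking.");
        InputSubsystem.Extensions.MLHandTracking.StartTracking();
    }

    private void OnDenied(string permission)
    {
        Debug.LogError($"{permission} denied; hand tracking will not work.");
    }
}
```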

Also, did you go to the Project Validation window and apply the fixes for the XR Interaction Toolkit?

Best,

El

Hi @etucker ,

Yes, the hand tracking permission is enabled, and when I press the Fix button in Project Validation the issue does not get resolved and keeps showing an error. I also manually imported the assets from the Package Manager.

Do you perhaps have any example that showcases all of this that I could work from as a template?

Hi @jc1,

Unfortunately, at the moment the hands subsystem package (com.unity.xr.hands) is not supported, as it needs OpenXR, which is not currently supported in Unity for the ML2. Apologies for suggesting it to you.

Here is a guide for setting up hand tracking on the ML2: Hand Tracking Overview | MagicLeap Developer Documentation
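To give you a head start: without com.unity.xr.hands, that guide's approach boils down to polling hand joints through Unity's built-in XR input API. A rough, untested sketch that follows the right index fingertip (you could parent your poke point or fingertip collider to this transform):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.MagicLeap;

// Sketch: move this transform to the right index fingertip every frame
// using the legacy UnityEngine.XR hand API that the ML2 guide builds on.
public class IndexTipFollower : MonoBehaviour
{
    private InputDevice _rightHand;
    private readonly List<Bone> _indexBones = new List<Bone>();

    private void Start()
    {
        InputSubsystem.Extensions.MLHandTracking.StartTracking();
    }

    private void Update()
    {
        if (!_rightHand.isValid)
        {
            _rightHand = InputSubsystem.Utils.FindMagicLeapDevice(
                InputDeviceCharacteristics.HandTracking | InputDeviceCharacteristics.Right);
            if (!_rightHand.isValid)
                return;
        }

        // Bones are ordered base-to-tip, so the last one is the fingertip.
        if (_rightHand.TryGetFeatureValue(CommonUsages.handData, out Hand hand) &&
            hand.TryGetFingerBones(HandFinger.Index, _indexBones) &&
            _indexBones.Count > 0 &&
            _indexBones[_indexBones.Count - 1].TryGetPosition(out Vector3 tip))
        {
            transform.position = tip;
        }
    }
}
```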

Here is our Unity Learn module: Hand tracking and gestures - Unity Learn

Let me know if you have any further questions.

Best,

El
