Hello there, I am currently building an application in Unity using the UI Poke components from the XR Hands Demo Scene, and I am facing frequent accuracy issues in gesture recognition: pokes are not recognized in 10-20% of cases (see video).
Are there any ways to reliably improve gesture recognition in this case (any additional settings, custom button implementation, etc.)?
Or can I potentially use MLSDK Hand Tracking along with OpenXR for all the other features?
I have tried the scene in two different setups, on two different ML2 devices, and the accuracy issues persist in all configurations:
Setup A (latest):
- Unity Editor 6.1
- ML2 OS 1.12
- MLSDK v2.6
- XR Interaction Toolkit 3.1.2
- XR Hands v1.6.1
Setup B:
- Unity Editor 2022.3.11f1
- ML2 OS 1.12
- MLSDK v2.3
- XR Interaction Toolkit 3.0.5
- XR Hands v1.4.1
I have tried:
- Adjusting the pinch threshold values in the XRI Input Action bindings for both hands (0.5 or 0.9 instead of the 0.75 recommended in the guide)
- Disabling Near-Far Interactor components for both hands in the Left and Right Hand game objects in the demo scene to keep the far interactor from overriding the pokes
- Adjusting the buttons’ position and rotation in the 3D scene for a better poke angle (UI inclined at 15/30/90 degrees; placed below the hand or at eye level)
None of this improved the poke accuracy. The grab interaction is also unreliable: grabs are falsely triggered by a half-open palm, or the grabbed object is not released.
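To illustrate what I mean by a custom button implementation: I am considering bypassing the XRI pinch action entirely and detecting the pinch myself from the XR Hands joint data, along these lines (untested sketch; the distance thresholds are guesses I would tune on device, and the class name is mine):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical fallback: distance-based pinch detection from thumb-tip and
// index-tip joint poses, with hysteresis to avoid flickering at the threshold.
public class CustomPinchDetector : MonoBehaviour
{
    [SerializeField] float pinchOnDistance = 0.02f;  // metres; assumed value, tune on device
    [SerializeField] float pinchOffDistance = 0.04f; // larger "release" threshold for hysteresis

    XRHandSubsystem handSubsystem;
    public bool IsPinching { get; private set; }

    void Update()
    {
        // Lazily grab the running XR Hands subsystem.
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            handSubsystem = subsystems[0];
        }

        var hand = handSubsystem.rightHand;
        if (!hand.isTracked) return;

        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);
        if (thumb.TryGetPose(out Pose thumbPose) && index.TryGetPose(out Pose indexPose))
        {
            float d = Vector3.Distance(thumbPose.position, indexPose.position);
            // Hysteresis: separate on/off thresholds so jitter near the
            // boundary does not toggle the pinch state every frame.
            if (!IsPinching && d < pinchOnDistance) IsPinching = true;
            else if (IsPinching && d > pinchOffDistance) IsPinching = false;
        }
    }
}
```

Would a hysteresis-based approach like this be expected to behave better than the built-in pinch action on ML2, or is the underlying joint tracking itself the limiting factor here?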