WebXR implementation issues - Hands, controller and trigger

I'm working on a Unity WebXR Export package, and on some games that use it.
A user tested it on the ML2 Browser and had some issues related to triggering select and selectstart events while the user's hands are tracked.

This is the package I'm working on: GitHub - De-Panther/unity-webxr-export: Develop and export WebXR experiences using Unity WebGL
And this is the game with the issue: https://worldsdemolisher.totalviz.com/

When using the controller, the user can click on buttons, but only while the user's hands are not in view.
Once the hands come into view, the trigger action stops working.
The hands' pinch gesture doesn't work either.
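
For reference, the package relies on the standard WebXR select/selectstart events; a simplified sketch of that wiring (illustrative, not the actual package code):

```javascript
// Simplified sketch; 'session' is an active immersive XRSession.
// The same events should fire for a controller trigger press
// and for a hand pinch gesture.
session.addEventListener('selectstart', (event) => {
  console.log('selectstart from', event.inputSource.handedness);
});
session.addEventListener('select', (event) => {
  // On ML2, this seems to stop firing once the hands are in view.
  console.log('select from', event.inputSource.targetRayMode);
});
```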

Something I wasn't expecting was that both the hands and the controller work at the same time.
I'm not sure whether that's valid according to the WebXR API; I looked through the specs and couldn't find any info about such a scenario.
There is a suggestion for it in the WebXR Hand Input Module repo: Expose hands and controllers at the same time · Issue #120 · immersive-web/webxr-hand-input · GitHub
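
For context, this is roughly how a page can tell the two input types apart (a sketch assuming the WebXR Hand Input Module):

```javascript
// Sketch: classify the session's current input sources.
// Per the WebXR Hand Input Module, a tracked hand exposes a non-null
// 'hand' attribute, while a controller usually exposes a 'gamepad'.
for (const source of session.inputSources) {
  if (source.hand) {
    console.log(`hand (${source.handedness})`);
  } else if (source.gamepad) {
    console.log(`controller (${source.handedness})`, source.profiles);
  }
}
// On other browsers this list holds hands OR controllers at any moment;
// on ML2 it apparently can hold both at once.
```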

I don't have a device to test on, so I can't verify whether it's an implementation issue in WebXR Export, in the game, or in the ML2 browser.

There's a demo for WebXR Export where the user should be able to grab some objects: Unity WebGL Player | Unity WebXR Export
I wonder if it's possible to grab them when using the ML2 browser.

Thanks

@De-Panther Thank you for your patience, and welcome to the Magic Leap developer forum.

It might help if I understood how the apps get exported to WebXR. While testing the following samples, WebXR - Samples, I can confirm that input selection works properly (but it only tracks the controller by default). Are you using Unity's OpenXR package for the WebXR input? Does this work on other passthrough AR devices? Is there a headset you would recommend testing it on?

Thanks for your reply.

Which samples did you test at WebXR Samples?
I think this one, Controller State, should have the most info, as it works with both controller and hand tracking and displays all button and axis states.
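
As far as I can tell, that sample polls each input source's gamepad every frame, roughly like this (a sketch, not the sample's actual code):

```javascript
// Sketch of per-frame polling in the spirit of the Controller State sample.
function onXRFrame(time, frame) {
  for (const source of frame.session.inputSources) {
    if (!source.gamepad) continue; // tracked hands may have no gamepad
    // Per the xr-standard gamepad mapping, buttons[0] is the trigger.
    const trigger = source.gamepad.buttons[0];
    console.log(source.handedness,
                'trigger:', trigger.pressed, trigger.value.toFixed(2),
                'axes:', source.gamepad.axes);
  }
  frame.session.requestAnimationFrame(onXRFrame);
}
```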

As I mentioned, I'm working on my own implementation (WebXR Export); Unity's OpenXR package doesn't support WebXR.
The demos and games made with my package work on all Meta Quest devices, and they worked on HoloLens 2 until Microsoft dropped support for hand tracking on WebXR.

I'm trying to figure out how controllers and hands are implemented in WebXR on ML2, as it's different from other browsers and devices.
Other browsers allow only one of the input types at a time, while ML2 allows input from controllers and hands at the same time.
While supporting both may work for tracking, it creates issues with trigger states.
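
If both source types really are live at once, an app would have to track trigger state per input source rather than per handedness; a rough sketch of what I mean (illustrative only):

```javascript
// Sketch: key trigger state by input source, since handedness alone is
// ambiguous when a hand and a controller report the same side.
const triggerDown = new Map();
session.addEventListener('selectstart', (event) => {
  const source = event.inputSource;
  const kind = source.hand ? 'hand' : 'controller';
  triggerDown.set(source, true);
  console.log(`${kind} (${source.handedness}) trigger down`);
});
session.addEventListener('selectend', (event) => {
  triggerDown.set(event.inputSource, false);
});
```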

We plan to support dynamic bindings in the next release. This will allow you to bind both hand and controller interactions, as mentioned in this post: Binding hands interaction actions with the controller actions causing issues

This will allow you to bind to the hand interactions or the controller, and the runtime will decide which one to provide depending on which hand the controller is held in (left, right, none).

Thanks again,

It's not a Unity-related issue, but a WebXR issue.
The WebXR specs aren't suited to handling both controllers and hands at the same time, which seems to be the situation with WebXR on ML2.

Can you please share a reference on how controllers and hands are implemented in WebXR on ML2?

I wonder if there is an existing WebXR demo that works properly when both controllers and hands are in view.
If you could test the Controller State sample and tell me whether it recognizes the trigger event when both the hands and the controller are in view, that would help.

In our upcoming release, the system will handle the transition for you. All you need to do is provide bindings for the hand_interaction profile and the controller profile. The runtime will then switch between them based on what's available.

Currently, the Controller State example only works with the controller (if the controller was active when the website was opened), and an immersive session with hands will only work if the controller was disabled when the website was launched.
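
On the web side, that switch should surface through the inputsourceschange event; a minimal sketch of listening for it:

```javascript
// Sketch: observe the runtime swapping input sources
// (e.g. controller disabled, hands taking over).
session.addEventListener('inputsourceschange', (event) => {
  for (const source of event.added) {
    console.log('added:', source.hand ? 'hand' : 'controller',
                source.handedness, source.profiles);
  }
  for (const source of event.removed) {
    console.log('removed:', source.handedness);
  }
});
```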

Let me know if you have any additional questions

So is it an issue in the ML2 browser implementation?
Or is it intended behavior?

I don't have an ML2 device to test with.
I want to confirm whether, and what, I should change on my end.

Thanks

On the latest OS, the application will handle controller input even when hands are visible, without you changing anything. Are you using the Hand Interaction profile in OpenXR when exporting? I noticed that hand input does not seem to work, but the hands are meshed properly.

Great. I wonder if that's where the issue is, but we'll know once the update is out.

Please ignore the Unity-related stuff and think of it as just another web framework. The package is not related to Unity's OpenXR package.
It hooks into the WebXR API using JavaScript/emscripten interop.
WebXR has the WebXR Input Profiles repository, which holds schemas and models for the hands and controllers of different devices, and the WebXR API exposes each input source's profile names, so I'm using those to display the models.
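
For illustration, resolving a profile name to a model looks roughly like this with the motion-controllers helper library (a sketch, not WebXR Export's actual code; the CDN path is an assumption):

```javascript
import { fetchProfile, MotionController } from '@webxr-input-profiles/motion-controllers';

// Assumed asset location; the real code may host the assets elsewhere.
const PROFILES_PATH =
  'https://cdn.jsdelivr.net/npm/@webxr-input-profiles/assets/dist/profiles';

async function loadInputModel(xrInputSource) {
  // fetchProfile matches xrInputSource.profiles against the registry
  // and returns the profile description plus the path to its 3D asset.
  const { profile, assetPath } = await fetchProfile(xrInputSource, PROFILES_PATH);
  const motionController = new MotionController(xrInputSource, profile, assetPath);
  // motionController.assetUrl can then be loaded with a glTF loader.
  return motionController;
}
```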

Thanks