OpenXR Controller: end of ray always at the same point in space?

I am trying to get started with my ML2, Unity, and OpenXR. I have a small problem with the controller that I can't figure out. Under my XR Origin and Camera Offset I have the right controller. In the XR Controller (Action-based) component, I have bound Position to pointer/position for the Magic Leap controller (Rotation is bound to rotation).

I have an XR Ray Interactor and an XR Interactor Line Visual. When I run my app, the near end of the ray stays attached to the controller, but the far end is stuck at a fixed point off in space. As I move the controller around, the near end follows it while the far end stays put. In short, the ray does not behave the way a controller ray is supposed to.

Any advice on why this happens? Thank you.

Unity Editor version: 2022.3.62f1
ML2 OS version:
Unity SDK version:
Host OS: (Windows/MacOS) Windows

Error messages from logs (syntax-highlighting is supported via Markdown):

Looks like this was related to rotation not working properly. For some reason, the ML controller rotation seems to do nothing.

Hey @eric_engineer ,

Glad you got this figured out!

The line renderer in Unity is actually just a 2D ‘billboard’ texture that always faces the camera, so rotating the controller will not appear to rotate the line since it’s trying to face the camera at all times.
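For what it's worth, Unity's LineRenderer does expose this behavior through its `alignment` property. A minimal sketch (the component name here is mine, not from the thread; it assumes a LineRenderer on the same GameObject):

```csharp
using UnityEngine;

// Hypothetical example: switch a LineRenderer from camera-facing
// billboarding (the default) to following its transform's orientation.
public class LineAlignmentExample : MonoBehaviour
{
    void Start()
    {
        var line = GetComponent<LineRenderer>();
        // LineAlignment.View (the default) billboards the line toward the camera;
        // LineAlignment.TransformZ orients it along the transform's Z axis instead.
        line.alignment = LineAlignment.TransformZ;
    }
}
```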

Let me know if you need any more help on this!

Best,

Corey

I am struggling hard to make the controller interact with canvas buttons. I spent most of yesterday trying various methods. I even tried the samples provided in ML Hub; they built but crash immediately on device. I did go look at the controller example and tried to follow exactly what they were doing, but whatever I try, I can't make the ray hit a button or trigger a hover or a click. I must be missing something. I tried to add more screenshots of my setup, but "new users can only add one piece of media." :slight_smile:

Anyway, for my controller I have a Tracked Pose Driver, an XR Controller (Action-based), an XR Ray Interactor, and an XR Interactor Line Visual.

The canvas is set to World Space, with size and scale modeled after your example from the Hub. It has a Tracked Device Graphic Raycaster, a raycast logger, a Canvas Scaler, and a Graphic Raycaster.

Also in the scene is an Event System with an Input System UI Input Module and an XR UI Input Module, and I have an XR Interaction Manager.
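As a sanity check on a setup like this, a small throwaway component can log which pieces are missing at runtime. This is a sketch under my own assumptions (component and class names are hypothetical except for the XR Interaction Toolkit types, which come from the `UnityEngine.XR.Interaction.Toolkit.UI` namespace in XRI 2.x):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.XR.Interaction.Toolkit.UI;

// Hypothetical diagnostic: attach to the world-space canvas and check
// that the pieces needed for XR ray-to-UI interaction are present.
public class CanvasRaySetupChecker : MonoBehaviour
{
    void Start()
    {
        var canvas = GetComponent<Canvas>();
        if (canvas == null || canvas.renderMode != RenderMode.WorldSpace)
            Debug.LogWarning("Canvas should be set to World Space for XR ray interaction.");

        // XR rays hit UI through TrackedDeviceGraphicRaycaster, not the
        // standard GraphicRaycaster (which serves screen-space pointers).
        if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
            Debug.LogWarning("Missing TrackedDeviceGraphicRaycaster on the canvas.");

        var eventSystem = FindObjectOfType<EventSystem>();
        if (eventSystem == null || eventSystem.GetComponent<XRUIInputModule>() == null)
            Debug.LogWarning("Scene needs an EventSystem with an XRUIInputModule.");
    }
}
```

One design note: having both an Input System UI Input Module and an XR UI Input Module on the same Event System can cause one to shadow the other, so a check like this is also a reasonable place to warn about duplicate input modules.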

I'm suspicious that for some reason the canvas is using my head as the pointer. I only say this because I was sometimes able to trigger a hover by looking away to the right. The controller and ray work fine with 3D objects.

Sorry for the info dump; this is day two of my struggle :slight_smile: I just want to click some buttons so I can collect some sensor data.

Ignore this; my device was on 1.2, just out of the box. Now on 1.12 the demos work.

Thank you.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.