Hello fellow reader,
We have a few questions regarding the FMETP Remote Desktop plugin in Unity. We managed to get it working, connecting from the Magic Leap 2 to a PC. After a successful connection we are able to stream the remote desktop; however, we are not able to control it with the ML2 controller. FMETP already implements a raycast solution for the Oculus Quest 2, but that version does not work with the ML2.
We would like the ML2 controller ray to act as the FMETP "cursor" so that full remote desktop control is possible. However, the beam in our scene does not register the FMETP panel (+ Desktop Viewer) and shoots straight through it. Has anyone faced this issue before, and if so, what is causing it?
We currently suspect it might have something to do with the material of the panel (a quad mesh), but we aren't completely sure.
Thanks for reading; we hope to find a solution together!
Unity Editor version: 2022.2.3f1
ML2 OS version: 1.2.0
MLSDK version: 1.7.0
Host OS: (Windows/MacOS)
@a.reality I am not familiar with the structure of the project for this plugin, but have you tried replacing the Main Camera object in the project with the Magic Leap 2 XR Rig prefab? This XR Rig is configured for AR Interaction and contains the controller.
I also recommend looking at the Plane Raycast guide as well as the Unity Examples to see how raycasting and interaction is handled in those scenes. The Unity XR Interaction Toolkit may be useful to you here.
@a.reality also make sure that the physics raycaster is enabled on your controller and the object that you're trying to interact with has a collider attached to it.
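A quick way to check whether the beam is hitting the viewer at all is to log raycast hits from the controller. This is only a debug sketch under assumptions (the script name and max distance are made up, and it assumes the controller transform points along its forward axis); if the quad never shows up in the log, the problem is a missing/disabled collider or a layer-mask mismatch rather than the FMETP material.

```csharp
using UnityEngine;

// Hypothetical debug helper: attach to the controller object to log
// what the physics ray hits each frame.
public class RayHitLogger : MonoBehaviour
{
    [SerializeField] private float maxDistance = 10f;

    void Update()
    {
        // Cast along the controller's forward direction, same as the visual beam.
        if (Physics.Raycast(transform.position, transform.forward,
                            out RaycastHit hit, maxDistance))
        {
            Debug.Log($"Ray hit: {hit.collider.name} at {hit.distance:F2} m");
        }
        else
        {
            Debug.Log("Ray hit nothing - check colliders and layer masks");
        }
    }
}
```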
@kvlasova thank you for your reply. We already have the ML2 XR Rig Prefab in our scene that contains the controller and is configured for AR Interaction, however, it is not working on the FMETP-Desktop-Viewer. Controlling other UI Elements is working fine.
We also compared our scene with the Unity Examples, but we have not found any difference yet. Should having the Tracked Device Graphic Raycaster on the canvas and the XR Ray Interactor on the controller be enough, or are we missing anything?
@kvlasova Yes, we have a collider attached to the object, but where can you enable the physics raycaster on the controller?
Can you try adding the XRSimpleInteractable script to the object? You can then add hover and select actions to it. I am not familiar with the structure of the FMETP-Desktop-Viewer object, but if it has a mesh filter, mesh renderer, and a collider, Unity's XRSimpleInteractable script will allow the controller to interact with it. You can also try other XR Interaction Toolkit AR Interaction scripts that fit your use case.
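A minimal sketch of that setup, assuming XR Interaction Toolkit 2.x is installed and this component is attached to the FMETP-Desktop-Viewer object (the script name and log messages here are illustrative, not part of FMETP):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical setup script: ensures an XRSimpleInteractable exists on the
// viewer object and wires up hover/select callbacks for testing.
[RequireComponent(typeof(Collider))]
public class DesktopViewerInteractable : MonoBehaviour
{
    void Awake()
    {
        // Reuse an existing XRSimpleInteractable, or add one at runtime.
        var interactable = GetComponent<XRSimpleInteractable>();
        if (interactable == null)
            interactable = gameObject.AddComponent<XRSimpleInteractable>();

        // Log when the controller ray hovers over or selects the panel.
        interactable.hoverEntered.AddListener(args =>
            Debug.Log("Hover entered on " + name));
        interactable.selectEntered.AddListener(args =>
            Debug.Log("Selected " + name));
    }
}
```

Once the hover log fires, you know the XR Ray Interactor sees the panel, and you can replace the Debug.Log calls with whatever forwards the cursor position to FMETP.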