Questions Regarding Magic Leap 2 and Unreal SDK

I have a couple of questions regarding the Magic Leap 2 Unreal SDK.

I have built Unreal from source (version 5.4.2), and I am using Unreal SDK 1.5 ("Release 1.5.0 is set up to work with the 5.4-release-ml2 branch, which is the default branch"). I tried using ML C SDK 1.8, but it failed, so I used version 1.7 instead. The build was successful with 1.7. The ML2 OS version is 1.8.0.

Here are my questions and issues:

  1. Unreal Examples: Which version of Unreal should I use? I am not able to open the example project because of a version mismatch (opening it with a different Unreal version fails).
  2. Controller: I have followed your manual, but I am unable to add the MagicLeapController to the VRPawn's Components. The Add button does not offer the controller option, even though I have enabled the ML plugins.
  3. Enhanced Input Action: When I add an Enhanced Input action for the controller, the visualization of the controller (hand) disappears, even though it shows up by default. The trigger action itself works successfully. Is this expected behavior?
  4. Hands: Should hand tracking work? I can't really find anything about it. Are there any examples or tutorials available for this? Does it also work with Remote Rendering?
  5. Remote Rendering: Which OpenXR-based features should work with remote rendering? Marker tracking? Planes?

Regards,
Mika

Welcome to the developer forum and congratulations on your first post.

Just writing to let you know that @aheaney is tracking down information and might be able to provide additional details about some of these, but I will try to answer a few:

Regarding the Unreal Engine and Unreal Plugin version 1.5.0:
It appears that there is a bug that caused the win directory to be omitted from the MLSDK lib folder. I have reported it to our engineering team.

If you plan to use the Magic Leap SDK APIs in addition to the OpenXR APIs, you will need to perform the following steps:

  1. Make sure you have version 1.7.0 of the Magic Leap SDK. Copy the MagicLeap\mlsdk\v1.7.0\lib\win folder into the MagicLeap\mlsdk\v1.8.0\lib\ directory before building the engine.

Note: you can also remain on the previous version as no new MLSDK APIs have been added.

Regarding the Hand Input:

Hand-tracking keypoints are supported through OpenXR extension XR_EXT_hand_tracking. Gesture detection is supported through OpenXR extension XR_EXT_hand_interaction.

Make sure the associated plugins are enabled in Unreal Engine. Both remote-render and on-device apps support hand tracking.

Regarding the Controller:
I also noticed that our Unreal Controller Input documentation needs to be updated.

See the previous post regarding the MagicLeapController:

Regarding Supported Extensions:

Here is the list of the OpenXR extensions that we officially support in Magic Leap Remote Rendering:

https://github.khronos.org/OpenXR-Inventory/extension_support.html#magicleap_remote_rendering

Thanks for the reply. It's very good to know that hand tracking is supported in remote rendering. However, I'm having a hard time getting it to work. The ML Hand Tracking plugin is enabled, etc. I am trying to use it in Blueprint. Should I look at the C++ side? Any hints on how to get hand tracking working would be very helpful.

Looking forward to additional details.

Hi Mika,

Here is a bit more information for each of your questions-

By the way, as Krystian said, MLSDK v1.8 didn't ship with the Windows libs, so building the engine against it will fail at the moment. This should be fixed for the next release. Currently, the only plugin in the UE SDK that depends on the ML (non-OpenXR) APIs is Voice. If you don't need Voice intents in your application and don't need to call other ML C APIs in your own code, then you don't need the ML SDK anyway.

  1. What is the issue you are having when attempting to open the examples project? If you have compiled the UE editor from the Magic Leap fork of the UE source at the tip of the 5.4-release-ml2 branch, with the SDK source from the v1.5 package, then you should be able to open and build the project. Note that the first time you open the project you will get a prompt saying that it was created with a different version of the UE editor and asking if you would like to create a copy. That is expected, since you are using a local build of the UE editor to open the project. If you accept the prompt, the project should open.

  2. I'm sorry for the confusion on this. We'll update the documentation soon. As Krystian said, there was a MagicLeapController component included in an older version of the UE SDK. It was a thin wrapper around the common MotionController component that is included out of the box. You should be able to use that instead (there is also a C++ sketch of this setup at the end of this post)-
    https://dev.epicgames.com/documentation/en-us/unreal-engine/motion-controller-component-setup-in-unreal-engine

  3. Theoretically, you should be able to use the XRDeviceVisualization component to render a model of the ML2 controller, as surfaced by the OpenXR runtime-
    https://dev.epicgames.com/documentation/en-us/unreal-engine/unreal-engine-5.2-release-notes#xrdevicevisualizationcomponent
    I've found recently that it seems to work inconsistently in UE 5.4, with any model, not just the one surfaced by OpenXR. I will do a bit of investigation here. In the meantime, if you want, you could try attaching a cube or some other mesh component to the MotionController instance instead, just to have a way to visualize the controller pose (see the C++ sketch at the end of this post).

  4. Yes, as Krystian said, hand tracking, including joint articulation and gesture detection, is available both when building an app to run natively on ML2 and when using remote rendering.

Tracking & joint articulation-

Basic hand and joint tracking is available through Epic's own OpenXRHandTracking plugin-

You can use the "Render Motion Controller" function in Epic's XRVisualization function library to render a simple visualization of the hand joints-

Example blueprint to render left & right hand joints-


This tutorial goes into some more detail-
https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unreal-engine/integrate-hand-tracking-data-your-hand-model/
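
If it helps, here is what the same joint visualization might look like in C++ rather than Blueprint. This is a minimal sketch, not official sample code: it assumes UE 5.4's GetMotionControllerData / FXRMotionControllerData API from the HeadMountedDisplay module and draws simple debug spheres instead of using the XRVisualization library, and the class name is a placeholder.

```cpp
// HandJointDebugActor.h - minimal sketch for visualizing tracked hand joints.
// Assumes the OpenXRHandTracking plugin is enabled and the project's Build.cs
// lists the "HeadMountedDisplay" module. Names here are placeholders.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "DrawDebugHelpers.h"
#include "HeadMountedDisplayFunctionLibrary.h"   // GetMotionControllerData, FXRMotionControllerData
#include "HandJointDebugActor.generated.h"

UCLASS()
class AHandJointDebugActor : public AActor
{
    GENERATED_BODY()

public:
    AHandJointDebugActor() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        DrawHandJoints(EControllerHand::Left);
        DrawHandJoints(EControllerHand::Right);
    }

private:
    void DrawHandJoints(EControllerHand Hand)
    {
        FXRMotionControllerData Data;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, Hand, Data);

        // The joint arrays are only populated while the runtime reports hand-tracking data.
        if (Data.bValid && Data.DeviceVisualType == EXRVisualType::Hand)
        {
            for (int32 i = 0; i < Data.HandKeyPositions.Num(); ++i)
            {
                const float Radius = Data.HandKeyRadii.IsValidIndex(i) ? Data.HandKeyRadii[i] : 1.0f;
                DrawDebugSphere(GetWorld(), Data.HandKeyPositions[i], Radius, 8, FColor::Cyan);
            }
        }
    }
};
```

Dropping an instance of that actor into the level should give you spheres following the hand joints, which is usually enough to confirm that tracking data is actually arriving before you wire up a proper hand mesh.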

Gesture detection-

As Krystian mentioned, gesture detection (as surfaced by the XR_EXT_hand_interaction OpenXR extension) is exposed through the MagicLeap Hand Interaction plugin-

After you enable the plugin, navigate to the Magic Leap Hand Interaction settings page in Project Settings to select which gestures you would like to detect. Note that no gestures are selected by default-

The plugin exposes a few blueprint functions that you can use to read the input state. Here is an example blueprint that will print the pinch value when detected, as in the screenshot above-

Note- the ML remote render view application already requests permission for hand tracking, but if you want to run your app natively on ML2, you'll need to request the com.magicleap.permission.HAND_TRACKING permission-
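
Also, if you are working in C++ and would rather route the pinch through Enhanced Input than the plugin's Blueprint functions, a binding could look like the sketch below. This is an assumption on my part, not plugin documentation: PinchAction is a hypothetical UInputAction (float value) member on your Pawn that you would map to whatever pinch input source the Hand Interaction plugin exposes in your Input Mapping Context.

```cpp
// Sketch only: log the pinch strength through Enhanced Input.
// PinchAction is a hypothetical UPROPERTY(EditAnywhere) UInputAction* on the Pawn,
// mapped to the hand-interaction pinch source in an Input Mapping Context.
#include "EnhancedInputComponent.h"
#include "InputActionValue.h"

void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    if (UEnhancedInputComponent* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        Input->BindAction(PinchAction, ETriggerEvent::Triggered, this, &AMyVRPawn::OnPinch);
    }
}

void AMyVRPawn::OnPinch(const FInputActionValue& Value)
{
    UE_LOG(LogTemp, Log, TEXT("Pinch value: %f"), Value.Get<float>());
}
```

Either way, the Blueprint route described above is the simpler one to start with.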

  5. Not all Magic Leap Unreal plugins in the SDK are supported by Magic Leap remote rendering right now, but several of them are-

Supported by remote render- Controller, HandInteraction, LightEstimation, LocalizationMap, MarkerTracking, Planes

Unsupported- PixelSensor, SpatialAnchors, SystemNotifications, UserCalibration, Voice, WorldMesh

Also, note that the Magic Leap remote rendering service only implements the OpenXR APIs, so you would not be able to call any ML C APIs when using remote rendering.
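
One more thing, regarding points 2 and 3 above: if your pawn is C++ based rather than the Blueprint VRPawn, the equivalent setup is just the stock MotionController component with a mesh attached so the pose is visible. A rough sketch (class and member names are placeholders; it requires the "HeadMountedDisplay" module in Build.cs):

```cpp
// Fragment of a Pawn constructor - a sketch of points 2 and 3 above.
// RightController and ControllerMesh are UPROPERTY members declared in the Pawn's header.
#include "MotionControllerComponent.h"
#include "Components/StaticMeshComponent.h"

AMyVRPawn::AMyVRPawn()
{
    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

    // Stock MotionController component in place of the old MagicLeapController wrapper.
    RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
    RightController->SetupAttachment(RootComponent);
    RightController->MotionSource = FName(TEXT("Right"));   // use "Left" for the other hand

    // Placeholder mesh so the controller pose is visible; assign a cube or any mesh in the editor.
    ControllerMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("ControllerMesh"));
    ControllerMesh->SetupAttachment(RightController);
}
```

In a Blueprint pawn the same thing is just the Motion Controller component from the Add menu (per the Epic doc linked above) with any mesh parented under it.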

Best,
Adam

