Hi Mika,
Here is a bit more information for each of your questions-
Btw, as Krystian said, MLSDK v1.8 didn't ship with Windows libs, so building with Unreal will fail atm. This should be fixed for the next release. Currently, the only plugin in the UE sdk that depends on ML (non-openxr) apis is Voice. If you don't need Voice intents in your application and don't need to call other ML C apis from your own code, then you don't need the ML SDK at all.
-
What issue are you running into when opening the examples project? If you have compiled the UE editor from the tip of the 5.4-release-ml2 branch of the MagicLeap fork of the UE source, with the sdk source from the v1.5 package, then you should be able to open and build the project. Note that the first time you open the project, you will get a prompt saying that it was created with a different version of the UE editor and asking if you would like to create a copy. That is expected, since you are opening the project with a local build of the UE editor. If you accept the prompt, the project should open.
-
I'm sorry for the confusion on this. We'll update the documentation soon. As Krystian said, there was a MagicLeapController component included in an older version of the UE sdk. It was a thin wrapper over the common MotionController component that is included out of the box, so you should be able to use MotionController directly instead-
https://dev.epicgames.com/documentation/en-us/unreal-engine/motion-controller-component-setup-in-unreal-engine
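If you'd rather set that up from C++ than in Blueprints, a minimal sketch might look like this (AControllerPawn is just an illustrative name; depending on your engine version the component lives in the HeadMountedDisplay module or the XRBase plugin, so add the matching module to your Build.cs):

    // ControllerPawn.h -- minimal pawn with a MotionController component.
    // "AControllerPawn" is illustrative, not something from the SDK.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Pawn.h"
    #include "MotionControllerComponent.h"
    #include "ControllerPawn.generated.h"

    UCLASS()
    class AControllerPawn : public APawn
    {
        GENERATED_BODY()

    public:
        AControllerPawn()
        {
            RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

            // Tracks the ML2 controller pose surfaced by the OpenXR runtime.
            MotionController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("MotionController"));
            MotionController->SetupAttachment(RootComponent);
            MotionController->MotionSource = FName(TEXT("Right")); // or "Left"
        }

        UPROPERTY(VisibleAnywhere)
        TObjectPtr<UMotionControllerComponent> MotionController;
    };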
-
Theoretically, you should be able to use the XRDeviceVisualization component to render a model of the ML2 controller, as surfaced by the OpenXR runtime-
https://dev.epicgames.com/documentation/en-us/unreal-engine/unreal-engine-5.2-release-notes#xrdevicevisualizationcomponent
I've recently found that it seems to work inconsistently in UE 5.4, with any model, not just the one surfaced by OpenXR. I'll do a bit of investigation here. In the meantime, if you want, you could try attaching a cube or some other mesh component to the MotionController instance, just to have a way to visualize the controller pose-
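For reference, here is a rough sketch of that workaround in C++, continuing the pawn constructor from the earlier snippet (the cube is just the engine's basic shape asset; any mesh works):

    // Still inside the AControllerPawn constructor from the sketch above.
    // Attach a small cube under the MotionController so the controller
    // pose is visible even when no device model is rendered.
    UStaticMeshComponent* DebugMesh =
        CreateDefaultSubobject<UStaticMeshComponent>(TEXT("ControllerDebugMesh"));
    DebugMesh->SetupAttachment(MotionController);
    DebugMesh->SetRelativeScale3D(FVector(0.05f)); // ~5 cm cube

    // Engine's built-in cube asset (needs #include "UObject/ConstructorHelpers.h").
    static ConstructorHelpers::FObjectFinder<UStaticMesh> CubeMesh(
        TEXT("/Engine/BasicShapes/Cube.Cube"));
    if (CubeMesh.Succeeded())
    {
        DebugMesh->SetStaticMesh(CubeMesh.Object);
    }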

-
Yes, as Krystian said, hand tracking, including joint articulation and gesture detection, is available both when building an app to run natively on ML2 and when using remote rendering.
Tracking & joint articulation-
Basic hand and joint tracking is available through Epic's own OpenXRHandTracking plugin-
You can use the "Render Motion Controller" function in Epic's XRVisualization function library to render a simple visualization of the hand joints-
Example blueprint to render left & right hand joints-
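Here is roughly the same logic as a C++ sketch, using GetMotionControllerData and RenderMotionController, which as far as I know are the C++ counterparts of those blueprint nodes (header locations and deprecation status vary a bit across engine versions, and AHandDebugPawn is just an illustrative class name):

    // Sketch: draw debug geometry for both hands every frame.
    // Assumes PrimaryActorTick.bCanEverTick = true on this pawn.
    #include "HeadMountedDisplayFunctionLibrary.h"
    #include "XRVisualizationFunctionLibrary.h"

    void AHandDebugPawn::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        FXRMotionControllerData HandData;

        // Joint data for the left hand, as reported by OpenXRHandTracking.
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
            this, EControllerHand::Left, HandData);
        if (HandData.bValid)
        {
            UXRVisualizationFunctionLibrary::RenderMotionController(HandData, /*bRight=*/false);
        }

        // Same for the right hand.
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
            this, EControllerHand::Right, HandData);
        if (HandData.bValid)
        {
            UXRVisualizationFunctionLibrary::RenderMotionController(HandData, /*bRight=*/true);
        }
    }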
This tutorial goes into some more detail-
https://developer.vive.com/resources/openxr/openxr-pcvr/tutorials/unreal-engine/integrate-hand-tracking-data-your-hand-model/
Gesture detection-
As Krystian mentioned, gesture detection (as surfaced by the XR_EXT_hand_interaction OpenXR extension) is exposed through the MagicLeap Hand Interaction plugin-
After you enable the plugin, navigate to the Magic Leap Hand Interaction settings page in Project Settings to select which gestures you would like to detect. Note that no gestures are selected by default-
The plugin exposes a few blueprint functions that you can use to read the input state. Here is an example blueprint that will print the pinch value when detected, as in the screenshot above-
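I don't have the plugin source in front of me, so take this C++ version of that blueprint as a sketch only: UMagicLeapHandInteractionFunctionLibrary and GetPinchStrength below are placeholder names, and the real getters are in the plugin's own blueprint function library-

    // Sketch only: the library and function names here are placeholders --
    // look up the actual getters in the Magic Leap Hand Interaction plugin.
    void AHandDebugPawn::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        float PinchValue = 0.0f;

        // Placeholder API: returns true while the pinch gesture is detected
        // and writes its current strength to PinchValue.
        if (UMagicLeapHandInteractionFunctionLibrary::GetPinchStrength(
                EControllerHand::Left, PinchValue))
        {
            // C++ equivalent of the Print String node in the blueprint.
            UE_LOG(LogTemp, Display, TEXT("Left pinch value: %f"), PinchValue);
        }
    }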
Note- the ML remote render view application already requests permission for hand tracking, but if you want to run your app natively on ML2, you'll need to request the com.magicleap.permission.HAND_TRACKING permission-
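If you want to do the runtime request from C++, a minimal sketch using the engine's stock AndroidPermission plugin would be something like this (note that the permission also needs to be declared in the manifest, e.g. via the Extra Permissions list under Project Settings > Android):

    // Minimal sketch using the engine's AndroidPermission plugin.
    #include "AndroidPermissionFunctionLibrary.h"

    void RequestHandTrackingPermission()
    {
        const FString Permission = TEXT("com.magicleap.permission.HAND_TRACKING");

    #if PLATFORM_ANDROID
        if (!UAndroidPermissionFunctionLibrary::CheckPermission(Permission))
        {
            // Pops the system permission dialog on device; effectively a no-op
            // if the permission level only requires a manifest declaration.
            UAndroidPermissionFunctionLibrary::AcquirePermissions({ Permission });
        }
    #endif
    }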
-
Not all MagicLeap Unreal plugins in the sdk are supported by MagicLeap remote rendering right now, but several of them are-
Supported by remote render- Controller, HandInteraction, LightEstimation, LocalizationMap, MarkerTracking, Planes
Unsupported- PixelSensor, SpatialAnchors, SystemNotifications, UserCalibration, Voice, WorldMesh
Also, note that the MagicLeap remote rendering service only implements OpenXR apis, so you also wouldn't be able to call any ML C apis using remote rendering.
Best,
Adam