Issue with Transparent Occluding Materials Recording as Black in Magic Leap Remote Rendering

Give us as much detail as possible regarding the issue you're experiencing:

Magic Leap SDK version: 1.6.0
Unreal Engine version: UE5.3
ML2 OS version: 1.6.0

I've been working with remote rendering on the Magic Leap 2 to achieve higher-quality AR content recordings. Following the official documentation, I managed to improve the clarity of the AR content; however, I'm running into an issue where materials that are both transparent and occluding are recorded as completely black.

I have set up the 'Allow through tonemapper' and adjusted the post-process volume settings as outlined in the Magic Leap remote rendering documentation.
How can I create materials that are both transparent and capable of occlusion when recording in this environment?

Any help or guidance would be greatly appreciated!

Thank you!


I would be curious to learn a bit more about what you'd like to do in your application. As you mentioned, there are currently some limitations with how environment blending works in Unreal with Magic Leap. It sounds like what you want is an opaque material you can use to render an opaque black mesh that makes regions of content appear invisible in the ML2 display (with the dimmer layer disabled), and also appear invisible in videos captured on the device. Is that right? That may be challenging to achieve, although hopefully we can find a solution.

Note that the ML2 display has both an RGB waveguide and a 'dimmer' layer. Although the RGB display is purely additive, you do have the ability to darken regions of the ML2 display on a per-pixel basis using the dimmer (although I believe the dimmer layer's resolution is a lot coarser than the RGB layer's).

If you record video or look at the device stream in ML Hub, you'll see an image composited with the feed from the RGB camera in the center of the front of your ML2. If you are not using the dimmer layer in the display, you may see dark regions appear in the video feed that you don't see in the ML2 display.

Setting up environment blending in Unreal is a little tricky right now, as the alpha values end up inverted, and there are caveats, as you probably saw in the remote render setup article, although it does work with remote render. However, you should see wholly transparent materials render as such in the composite (although they would then no longer appear to occlude content in the ML2 display).
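To make the inverted-alpha point concrete, here's a minimal toy sketch of the idea. This is illustrative only, not the actual ML2 or Unreal compositing pipeline: it assumes Unreal's output alpha is 0 where content is opaque and 1 where it is transparent, so a compositor expecting conventional coverage has to flip it.

```python
def coverage_from_unreal_alpha(scene_alpha):
    # Assumed convention: Unreal's scene alpha is inverted (1.0 = transparent),
    # so flip it to get conventional coverage (1.0 = opaque).
    return 1.0 - scene_alpha

def composite(virtual_rgb, coverage, camera_rgb):
    # Standard 'over' blend of the virtual frame on top of the camera feed.
    return tuple(v * coverage + c * (1.0 - coverage)
                 for v, c in zip(virtual_rgb, camera_rgb))

camera = (0.4, 0.5, 0.6)  # made-up camera-feed pixel

# A wholly transparent material (scene alpha 1.0 -> coverage 0.0)
# lets the camera feed show through untouched in the composite:
print(composite((0.0, 0.0, 0.0), coverage_from_unreal_alpha(1.0), camera))
```

With the alpha flipped correctly, a fully transparent pixel simply passes the camera feed through, which is the behavior described above.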


Thanks for your kind reply. I think I worded the question a little incorrectly at first.
I need a material that still occludes objects behind it when recording video during remote rendering.
As it stands, a material that would normally be used for occlusion is recorded as black in remote rendering.


Rendering an opaque black material is a reasonable technique for efficiently occluding content on a device with an additive display. However, it will show up as black if you create a capture of the virtual content.
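A quick toy model of why this happens. The numbers and functions below are illustrative assumptions, not the real ML2 pipeline: the additive waveguide can only add light on top of the real world, so black adds nothing and looks invisible, while the capture pipeline composites the virtual frame over the camera feed, so an opaque black pixel fully replaces the feed.

```python
def additive_display(virtual_rgb, real_world_rgb):
    # The RGB waveguide can only ADD light to what the eye already sees.
    return tuple(min(v + r, 1.0) for v, r in zip(virtual_rgb, real_world_rgb))

def capture_composite(virtual_rgb, coverage, camera_rgb):
    # The capture pipeline blends virtual content OVER the camera feed.
    return tuple(v * coverage + c * (1.0 - coverage)
                 for v, c in zip(virtual_rgb, camera_rgb))

real = (0.4, 0.5, 0.6)       # made-up real-world / camera pixel
black = (0.0, 0.0, 0.0)      # opaque black occluder, coverage 1.0

print(additive_display(black, real))        # unchanged: occluder is invisible
print(capture_composite(black, 1.0, real))  # solid black in the capture
```

Same pixel, same material: invisible through the headset, black in the recording.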

You may want to experiment with Unreal's AlphaHoldout blend mode, which lets you use a mesh to 'hold out' the alpha of the objects behind it. That should achieve a similar occlusion effect while appearing transparent in image captures, although it will be more computationally expensive.
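As a rough sketch of the holdout idea (again a toy model, not the actual renderer): the holdout mesh contributes no color of its own; it just zeroes out the coverage of whatever it masks, so the capture compositor shows the camera feed through the hole.

```python
def apply_holdout(virtual_rgb, coverage, held_out):
    # A held-out pixel is punched out of the virtual layer entirely:
    # no color contribution, zero coverage.
    if held_out:
        return (0.0, 0.0, 0.0), 0.0
    return virtual_rgb, coverage

def composite(virtual_rgb, coverage, camera_rgb):
    return tuple(v * coverage + c * (1.0 - coverage)
                 for v, c in zip(virtual_rgb, camera_rgb))

camera = (0.4, 0.5, 0.6)                       # made-up camera-feed pixel
content = ((0.8, 0.2, 0.2), 1.0)               # opaque virtual content behind the mask

rgb, cov = apply_holdout(*content, held_out=True)
print(composite(rgb, cov, camera))             # camera feed shows through
```

Where the black-occluder approach bakes black into the capture, the holdout zeroes coverage instead, so the recording shows the real world at those pixels.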

Hope that helps! I can do a bit of experimentation as well and get back to you.