In my packaged build (regardless of configuration: Shipping, Debug, or Development), nothing is rendered during play unless MobileHDR is on. The game is running just fine, as it's outputting logs; there's just nothing on screen, and no apparent errors in the log.
Sorry about the trouble. Just to confirm, I was able to reproduce this behavior. Our internal team is looking into it, and we'll see if we can find a workaround. Could you please share the exact component versions you were using? I was able to repro with slightly older components, but just to check:
These are the latest versions:
OS - 1.9.0
UE SDK - 1.6.0
UE - 5.4.2 (latest commit to the 5.4-release-ml2 branch has hash 3334e7950335cebdb5c81fa1e46514baf76c87b2)
Our team verified that the issue is still present. We'll file a bug on it internally. Thanks for raising this concern!
Also, something to consider: even if disabling the MobileHDR setting worked, you still probably wouldn't be able to use the capture stream with virtual content composited over the camera feed. The noise would go away, but the alpha would still be inverted, so the background would appear black and opaque in the stream. It's not ideal, but for a demo the best you can probably do is show a stream of the virtual content only.
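To illustrate why inverted alpha turns the background black and opaque: in standard "over" compositing, an empty virtual pixel should have alpha 0 (transparent) so the camera feed shows through; with the alpha flipped, that same pixel becomes fully opaque black. This is just a sketch of the compositing math, not the actual ML2 compositor code:

```python
def over(fg, fg_alpha, bg):
    """Standard 'over' compositing of one channel: fg on top of bg."""
    return fg * fg_alpha + bg * (1.0 - fg_alpha)

# An empty pixel in the virtual layer: black, alpha 0 (fully transparent).
virtual_rgb = 0.0
correct_alpha = 0.0
camera_bg = 0.8  # a bright camera-feed pixel behind it

# Correct alpha: the camera feed shows through.
print(over(virtual_rgb, correct_alpha, camera_bg))  # 0.8

# Inverted alpha: the same empty pixel is now fully opaque,
# so the camera feed is replaced with black.
inverted_alpha = 1.0 - correct_alpha
print(over(virtual_rgb, inverted_alpha, camera_bg))  # 0.0
```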
If you're using the Capture function in ML Hub to show the stream, you can set Visuals to Virtual Only to hide the camera feed:
If it works for your demo, another option would be to use remote rendering. If your app is running on a Windows PC, you won't be stuck with the incoherent alpha, because you can use post-processing to invert it. There's a guide on how to do this here:
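To make "invert it" concrete: since the frame is produced on the PC with remote rendering, a post-process step (in UE this would typically be a post-process material) only needs to flip the alpha channel per pixel. A minimal sketch of that operation, written in plain Python rather than shader code:

```python
def invert_alpha(rgba):
    """Flip the alpha channel of one RGBA pixel, leaving color untouched."""
    r, g, b, a = rgba
    return (r, g, b, 1.0 - a)

# A pixel the renderer marked fully transparent (alpha 0) becomes
# fully opaque (alpha 1), undoing the inversion in the source frame.
print(invert_alpha((0.2, 0.4, 0.6, 0.0)))  # (0.2, 0.4, 0.6, 1.0)
```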
No problem! It doesn't require it. You only need the ML-specific UE branch if you want to package an app to run natively on your ML2. If you don't need the engine source code, you can use the official UE 5.4 builds available from Epic's launcher.
This remote rendering guide page has instructions for using the ML UE SDK without building the engine from source. Note that this only works with remote rendering. Also, not all of the MagicLeap plugins work with remote rendering, but several of them do:
Btw, if it works for your project, I highly recommend setting up remote rendering anyway for Unreal development with ML, even if you're building a native app, as a workflow accelerator. You can use VR Preview in the UE editor to stream to the Magic Leap over remote rendering, which lets you test changes instantly as part of your inner loop instead of having to build, package, and deploy every time.
You know what, your suggestion to use remote rendering as part of my pipeline is an amazing idea, and I'm kicking myself for not thinking of it. It will make debugging so much easier and faster; you've just saved future me so many hours. If it's not already there, I'd add that suggestion to the Unreal Engine "Getting Started" section. Thank you for that.
I'm having trouble getting the background in the Magic Leap stream to be anything but black when using remote rendering. I followed the steps in the documentation, so I'm unsure what else to try.
I've been trying the remote rendering feature, but there are some really weird artifacts: mesh vertices in the headset are in the wrong place, though not in the computer stream. Also, the eyes are possibly rendering on the wrong side. It's super disorienting and hard to describe.
Would there happen to be an alteration to UE5.4 I can make to remedy this?