Highly Simple Remote Rendering on Unity

Hello,

I'm developing a very simple Unity application that coordinates content displayed on a standard monitor with content displayed on the ML2 headset. Simply put, the ML2 headset is only being used as an output device to display content and nothing more.

Based on this thread, @aheaney suggests that such basic rendering is possible in Unity. Would you mind giving me some guidance on how I could achieve this? My guess is that I'll need to write some basic networking protocol to stream the contents of the Unity Display to the ML2 headset, but the developer docs don't seem to have information regarding this.

For further context, up until recently I've been using the HL2 to develop this application, and that device offers a remote rendering feature where you simply provide the video encoding parameters and it forwards the rendered output of the specified Unity Camera to the HL2 device itself (see image attached below). I'm essentially looking for something very similar to this.

Note: I'm aware that the HL2 feature only works in Unity preview / play mode, not in a built application, but the point I'm trying to make is that since it works at a basic level, I should be able to hack it together to work in a fully built executable by replicating the logic of the Unity plugin the HL2 SDK provides. I'm hoping to implement the remote rendering I've described above directly in the fully built app itself.

Thank you for taking the time to read and answer my question! If anything in my message is unclear, please feel free to reply so I can follow up.

Cheers,
Monde

Unity Editor version: Currently on 2022f, but I can migrate to whatever version is required
ML2 OS version: Newly purchased, can update to whichever version is required
Host OS: Windows


Hi @budmonde,

Welcome to the Magic Leap 2 Developer Forums. We are grateful for your engagement here.

For something rudimentary with Unity, you would have to omit any Magic Leap extensions and the MLSDK, and use only the OpenXR features provided. Basic head pose, rendering, and controller input should work. If you set OpenXR as the XR provider and build for Windows desktop, that should be good enough. Connection to the HMD is handled as normal by Remote Render: run the .exe and you should see the app streamed to the ML2, as long as Remote Render is the active OpenXR runtime.
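As a sanity check, you could log at startup whether an XR display subsystem is actually driving the headset. A minimal sketch, assuming Unity's built-in `UnityEngine.XR` subsystem API (the class name `XrRuntimeCheck` is mine, not part of any SDK):

```csharp
// Attach to any GameObject in the startup scene.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class XrRuntimeCheck : MonoBehaviour
{
    void Start()
    {
        // Collect whatever XR display subsystems the OpenXR provider created.
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        if (displays.Count == 0 || !displays[0].running)
        {
            // Nothing is driving an HMD — most likely Remote Render is not
            // the active OpenXR runtime on this machine.
            Debug.LogWarning("No running XR display subsystem. " +
                             "Is Remote Render set as the active OpenXR runtime?");
        }
        else
        {
            Debug.Log("XR display subsystem is running; frames should stream to the ML2.");
        }
    }
}
```

This way a desktop build fails loudly (in the player log) instead of silently rendering to the monitor only.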

One issue that you may run into is the global dimmer being set to max. If you see an opaque background, you can disable the dimmer or override it using these Windows environment variables:

```
WINDRUNNER_OPAQUE_BLEND_MODE
WINDRUNNER_ENABLE_GLOBAL_DIMMER_OVERRIDE
WINDRUNNER_GLOBAL_DIMMER_VALUE_OVERRIDE
```
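For example, setting these in a Command Prompt before launching the build might look like the following. The specific values and the executable name are assumptions on my part — check the Remote Rendering docs for the exact semantics of each variable:

```
:: Sketch only — values are assumptions, not documented in this thread.
:: Intent: disable opaque blending and force the global dimmer fully off.
set WINDRUNNER_OPAQUE_BLEND_MODE=0
set WINDRUNNER_ENABLE_GLOBAL_DIMMER_OVERRIDE=1
set WINDRUNNER_GLOBAL_DIMMER_VALUE_OVERRIDE=0.0

:: Launch the Windows desktop build (hypothetical name).
MyRemoteRenderApp.exe
```

Variables set this way apply only to that Command Prompt session; use the System Properties dialog (or `setx`) if you want them to persist.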
