API to access the MR video stream from Magic Leap Hub 3

Give us as much detail as possible regarding the issue you’re experiencing:

Unity Editor version: 2022.3.7f1
ML2 OS version: (latest as of March 2025)
Unity SDK version: 1.4.0
Host OS: Windows 11

I was able to share the MR video with my WebRTC server by using the MLCamera class to generate a RenderTexture that is then converted to a VP8 stream. I'm very happy that this works so straightforwardly. The problem now is quality and performance: this approach works for a prototype, but it is not usable in a shippable product for my customer.

I was wondering if there is an API that gives me access to the stream that the Magic Leap Hub 3 is able to show. Enabling the stream there doesn't hurt the performance of the Magic Leap 2 as much, and it also looks much better.

Unfortunately, we don't provide this type of API. However, the Magic Leap 2 is compatible with the Android Camera2 API, which means that if a third-party solution can stream the camera on an Android 10 device, you can use it on the Magic Leap 2 as well.
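As a rough illustration of that compatibility, one could query the standard Camera2 CameraManager from Unity through the JNI bridge. This is only a hedged sketch, not an official Magic Leap sample; it uses standard Android and Unity APIs (Camera2Probe is a hypothetical name):

    using UnityEngine;

    public static class Camera2Probe
    {
        // Logs the camera IDs exposed by the standard Android Camera2 stack,
        // confirming the device camera is reachable through regular Android APIs.
        public static void LogCameraIds()
        {
    #if UNITY_ANDROID && !UNITY_EDITOR
            using var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
            using var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity");

            // Context.getSystemService(Context.CAMERA_SERVICE) returns the CameraManager.
            using var cameraManager = activity.Call<AndroidJavaObject>("getSystemService", "camera");

            foreach (var id in cameraManager.Call<string[]>("getCameraIdList"))
            {
                Debug.Log($"Camera2 device id: {id}");
            }
    #endif
        }
    }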

Depending on your timeline, you could take a look at the deprecated WebRTC example that shipped in the 1.12.0 Unity Examples project. The WebRTC APIs are deprecated and no longer maintained or supported, but they might provide insight into how to stream the video without the initial conversion to a RenderTexture.

I've already adjusted the code so that the output is not a RenderTexture but a simple Texture2D. Unfortunately, this does not change much.

I use this snippet and these parameters to get the texture from the MLCamera (MrcTexture is public and is accessed by the WebRTC client to get the texture):

    private void OnCaptureRawVideoFrameAvailable(MLCameraBase.CameraOutput frameInfo, MLCameraBase.ResultExtras resultExtras, MLCameraBase.Metadata metadataHandle)
    {
        // Drop incoming frames while the previous texture upload is still running.
        if (!_updatingTexture)
        {
            _updatingTexture = true;
            UpdateRGBTexture(frameInfo);
        }
    }

    private void UpdateRGBTexture(MLCamera.CameraOutput output)
    {
        var imagePlane = output.Planes[0];

        // The texture is allocated at the stride width (in pixels) so the raw
        // buffer can be uploaded in a single call; note that width itself is
        // unused, so any row padding shows up as extra columns in the texture.
        var strideInPixels = (int)(imagePlane.Stride / imagePlane.BytesPerPixel);
        var height = (int)imagePlane.Height;
        var width = (int)imagePlane.Width;

        if (MrcTexture == null)
        {
            MrcTexture = new Texture2D(strideInPixels, height, TextureFormat.RGBA32, false)
            {
                filterMode = FilterMode.Bilinear
            };
        }

        // Upload the raw RGBA plane to the GPU.
        MrcTexture.LoadRawTextureData(imagePlane.Data);
        MrcTexture.Apply(false);

        _updatingTexture = false;
    }

    [SerializeField] private RawImage _preview;

    private bool _isCameraInitializationInProgress;
    private bool _isCameraConfiguredAndReady = false;
    private bool _updatingTexture = false;

    // MR capture parameters used when connecting the camera (see the sketch below).
    private const int TargetWidth = 648;
    private const int TargetHeight = 720;

    private const MLCameraBase.CaptureFrameRate FrameRate = MLCameraBase.CaptureFrameRate._30FPS;
    private const MLCameraBase.OutputFormat OutputFormat = MLCameraBase.OutputFormat.RGBA_8888;
    private const MLCameraBase.Identifier CamId = MLCameraBase.Identifier.Main;
    private const MLCamera.ConnectFlag ConnectFlags = MLCamera.ConnectFlag.MR;
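For context, here is a rough sketch of how these constants would feed into the MLCamera connect/configure flow. It follows the async MLCamera API from the 1.x Unity SDK, but exact method names may differ between SDK versions; the _camera field and ConnectAndStartCapture name are added for illustration, and error handling is omitted:

    private MLCamera _camera;

    private async void ConnectAndStartCapture()
    {
        // Connect the Main camera with the MR flag so frames include the
        // composited virtual content.
        var context = MLCamera.ConnectContext.Create();
        context.CamId = CamId;
        context.Flags = ConnectFlags;
        _camera = await MLCamera.CreateAndConnectAsync(context);

        // Configure one RGBA video stream at the target resolution and frame rate.
        var config = new MLCamera.CaptureConfig
        {
            CaptureFrameRate = FrameRate,
            StreamConfigs = new[]
            {
                MLCamera.CaptureStreamConfig.Create(
                    new MLCamera.StreamCapability
                    {
                        Width = TargetWidth,
                        Height = TargetHeight,
                        CaptureType = MLCamera.CaptureType.Video
                    },
                    OutputFormat)
            }
        };

        _camera.PrepareCapture(config, out MLCamera.Metadata _);
        _camera.PreCaptureAEAWB();
        await _camera.CaptureVideoStartAsync();

        // Raw frames arrive in the handler shown above.
        _camera.OnRawVideoFrameAvailable += OnCaptureRawVideoFrameAvailable;
        _isCameraConfiguredAndReady = true;
    }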

…and this is used to convert the Texture2D into a WebRTC stream (from the LiveKit Unity example):

    private IEnumerator IPublishCam()
    {
#if UNITY_EDITOR
        // No ML camera in the Editor, so publish a placeholder texture instead.
        var source = new TextureVideoSource(_placeHolder);
#else
        // Wait until the MLCamera has delivered its first frame to MrcTexture.
        while (_mrcCam.MrcTexture == null)
        {
            yield return new WaitForEndOfFrame();
        }

        var source = new TextureVideoSource(_mrcCam.MrcTexture);
#endif

        var track = LocalVideoTrack.CreateVideoTrack("my-video-track", source, _room);

        // Publish as H.264, capped at 1.5 Mbit/s and 30 FPS, without simulcast.
        var options = new TrackPublishOptions();
        options.VideoCodec = VideoCodec.H264;
        var videoCoding = new VideoEncoding();
        videoCoding.MaxBitrate = 1500000;
        videoCoding.MaxFramerate = 30;
        options.VideoEncoding = videoCoding;
        options.Simulcast = false;
        options.Source = TrackSource.SourceCamera;

        var publish = _room.LocalParticipant.PublishTrack(track, options);
        yield return publish;

        if (!publish.IsError)
        {
            Debug.Log("Track published!");
        }

        source.Start();
        StartCoroutine(source.Update());
        _rtcVideoSources.Add(source);
    }

Are there any parameters I could adjust to get a better result?

Are there any plans to implement the Camera2 features from the Android SDK in the Magic Leap SDK, to give access to a better-performing stream in Unity?

I'm evaluating whether the Magic Leap 2 is a contender for our solution, since Microsoft has recently buried the HoloLens program. A stable WebRTC stream is one of the most important aspects of our application.

Since the camera can be accessed like a regular Android camera, even supporting WebCamTexture, you may be able to raise this feedback by posting on the Unity WebRTC package GitHub project. Unity would need to make these changes: add support for direct Camera2 streaming, or some way to skip rendering to a texture first.
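For reference, a minimal WebCamTexture sketch along those lines; the device index, resolution, and frame rate below are illustrative assumptions, not verified ML2 values:

    using UnityEngine;

    public class WebCamProbe : MonoBehaviour
    {
        private WebCamTexture _camTexture;

        private void Start()
        {
            // List the cameras Unity sees through the standard Android camera stack.
            foreach (var device in WebCamTexture.devices)
            {
                Debug.Log($"WebCam device: {device.name}");
            }

            if (WebCamTexture.devices.Length == 0)
            {
                return;
            }

            // Request 648x720 @ 30 FPS from the first device (illustrative values).
            _camTexture = new WebCamTexture(WebCamTexture.devices[0].name, 648, 720, 30);
            _camTexture.Play();

            // WebCamTexture derives from Texture, so it could be handed to a
            // TextureVideoSource the same way MrcTexture is used above.
        }
    }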

As mentioned previously, you might want to look at the WebRTC example from the Magic Leap 1.12.0 Unity Examples project; however, that example relies on deprecated Magic Leap-specific APIs and predates the Magic Leap 2's support for the regular Camera2 API.