Request for Help in Capturing a Wider Field of View Using ML2

Hello Community,

I am currently working on a project where I need to capture and record the real world using my Magic Leap 2 (ML2) device. I'm using the RGB camera functionality, but it only captures the view directly in front of the camera.

Here is my issue: when capturing video with the ML2, I can only record a limited field of view, i.e., the area directly in front of the device. What I'm looking for is a way to capture a larger, wider view.

Would it be possible to achieve this by using both the RGB camera and the Depth Camera at the same time? If so, I would appreciate any guidance on how to do this. If not, what alternatives do I have for increasing the captured field of view?

Here are the specifics of my setup:

Unity Editor version: 2022.2.0f1
ML2 OS version: Version 1.3.0-dev1, Build B3E.230427.10-R.023, Android API Level 29
MLSDK version: 1.7.0
Host OS: Windows 11

As of now, I haven't encountered any explicit error messages. However, I'm not getting the expected results in terms of the field of view coverage.

This is the view I get when capturing with Magic Leap Hub's "Advanced Capture":

Thank you in advance for any help you can provide.

Best Regards,

I have recorded a couple of videos using the RGB camera functionality; I am sharing my source code here:

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using MagicLeap.Core;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.MagicLeap;


namespace MagicLeap.Examples
{

    /// <summary>
    /// This class handles video recording and image capturing based on controller
    /// input.
    /// </summary>
    public class TestCameraRecording : MonoBehaviour
    {
        private MLCamera.CaptureFrameRate FrameRate = MLCamera.CaptureFrameRate._30FPS;
        private MLCamera.OutputFormat OutputFormat = MLCamera.OutputFormat.RGBA_8888;
        private MLCamera captureCamera;
        private bool isCapturingVideo = false;

        [SerializeField, Tooltip("Button that starts the Capture")]
        private Button recordButton;
        private bool skipFrame = false;


        [SerializeField, Tooltip("Reference to media player behavior used in camera capture playback")]
        private MLMediaPlayerBehavior mediaPlayerBehavior;

        private readonly CameraRecorder cameraRecorder = new CameraRecorder();


        private const string validFileFormat = ".mp4";

        private bool isCapturingPreview = false;
        private bool RecordToFile = true;

        private string recordedFilePath;
        private MLCamera.CaptureType CaptureType = MLCamera.CaptureType.Video;


        private List<MLCamera.StreamCapability> streamCapabilities;

        private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

        private bool cameraDeviceAvailable;

        [SerializeField, Tooltip("Reference to the Raw Video Capture Visualizer gameobject for YUV frames")]
        private CameraCaptureVisualizer cameraCaptureVisualizer = null;


        private void Awake()
        {
            permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
            permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
            permissionCallbacks.OnPermissionDeniedAndDontAskAgain += OnPermissionDenied;

            //connectionFlagDropdown.AddOptions(
            //    MLCamera.ConnectFlag.CamOnly,
            //    MLCamera.ConnectFlag.MR,
            //    MLCamera.ConnectFlag.VirtualOnly);

            recordButton.onClick.AddListener(StartVideoCapture);
            //connectButton.onClick.AddListener(ConnectCamera);
            //disconnectButton.onClick.AddListener(DisconnectCamera);
            //connectionFlagDropdown.onValueChanged.AddListener(v => RefreshUI());
            //streamCapabilitiesDropdown.onValueChanged.AddListener(v => RefreshUI());
            //qualityDropDown.onValueChanged.AddListener(v => RefreshUI());
            //captureTypeDropDown.onValueChanged.AddListener(v => RefreshUI());
            //frameRateDropDown.onValueChanged.AddListener(v => RefreshUI());

            //RefreshUI();
        }

        // Start is called before the first frame update

        private void Start()
        {
            Debug.Log("Start");
            MLPermissions.RequestPermission(MLPermission.Camera, permissionCallbacks);
            MLPermissions.RequestPermission(MLPermission.RecordAudio, permissionCallbacks);

            TryEnableMLCamera();
        }

        private void TryEnableMLCamera()
        {
            if (!MLPermissions.CheckPermission(MLPermission.Camera).IsOk)
                return;

            StartCoroutine(EnableMLCamera());
        }

        private IEnumerator EnableMLCamera()
        {
            while (!cameraDeviceAvailable)
            {
                MLResult result =
                    MLCamera.GetDeviceAvailabilityStatus(MLCamera.Identifier.Main, out cameraDeviceAvailable);
                if (!(result.IsOk && cameraDeviceAvailable))
                {
                    // Wait until camera device is available
                    yield return new WaitForSeconds(1.0f);
                }
                else
                {
                    ConnectCamera();

                    // Camera device is available, start video capture here
                }
            }

            Debug.Log("Camera device available");
        }


        private void OnPermissionDenied(string permission)
        {
            if (permission == MLPermission.Camera)
            {
                MLPluginLog.Error($"{permission} denied, example won't function.");
            }
            else if (permission == MLPermission.RecordAudio)
            {
                MLPluginLog.Error($"{permission} denied, audio won't be recorded in the file.");
            }

            //RefreshUI();
        }



        private void OnPermissionGranted(string permission)
        {
            MLPluginLog.Debug($"Granted {permission}.");
            TryEnableMLCamera();

            //RefreshUI();
        }

        private void StartVideoCapture()
        {
            // recordedFilePath = string.Empty;
            // skipFrame = false;

            var result = MLPermissions.CheckPermission(MLPermission.Camera);
            MLResult.DidNativeCallSucceed(result.Result, nameof(MLPermissions.CheckPermission));
            Debug.Log($"MLPermissions.CheckPermission {result}");
            if (!result.IsOk)
            {
                Debug.LogError($"{MLPermission.Camera} permission denied. Video will not be recorded.");
                return;
            }

            if (RecordToFile)
                StartRecording();
            else
                StartPreview();
        }

        private void StartRecording()
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.MediaPlayer.OnPrepared += MediaPlayerOnOnPrepared;
            mediaPlayerBehavior.MediaPlayer.OnCompletion += MediaPlayerOnCompletion;
#endif
            string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + validFileFormat;
            recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);

            CameraRecorderConfig config = CameraRecorderConfig.CreateDefault();
            config.Width = streamCapabilities[0].Width;
            config.Height = streamCapabilities[0].Height;
            config.FrameRate = MapFrameRate(MLCamera.CaptureFrameRate._30FPS);

            cameraRecorder.StartRecording(recordedFilePath, config);

            int MapFrameRate(MLCamera.CaptureFrameRate frameRate)
            {
                switch (frameRate)
                {
                    case MLCamera.CaptureFrameRate.None: return 0;
                    case MLCamera.CaptureFrameRate._15FPS: return 15;
                    case MLCamera.CaptureFrameRate._30FPS: return 30;
                    case MLCamera.CaptureFrameRate._60FPS: return 60;
                    default: return 0;
                }
            }

            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = FrameRate;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] = MLCamera.CaptureStreamConfig.Create(streamCapabilities[0], OutputFormat);
            captureConfig.StreamConfigs[0].Surface = cameraRecorder.MediaRecorder.InputSurface;

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            Debug.Log($"Check Camera is ready for capture {MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture))}");

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));

                    Debug.Log($"isCapturingVideo {isCapturingVideo}");

                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, RecordToFile);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    if (isCapturingPreview)
                    {
                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, RecordToFile);
                    }
                }
            }
        }
        private void StartPreview()
        {
            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._30FPS;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] =
                MLCamera.CaptureStreamConfig.Create(streamCapabilities[0], OutputFormat);

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));

                    Debug.Log($"isCapturingVideo {isCapturingVideo}");
                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, true);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    Debug.Log($"isCapturingPreview {isCapturingPreview}");
                    if (isCapturingPreview)
                    {

                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, true);
                    }
                }
            }
        }

        private void ConnectCamera()
        {
            MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
            context.Flags = MLCamera.ConnectFlag.CamOnly;
            context.EnableVideoStabilization = true;

            if (context.Flags != MLCamera.ConnectFlag.CamOnly)
            {
                context.MixedRealityConnectInfo = MLCamera.MRConnectInfo.Create();
                context.MixedRealityConnectInfo.MRQuality = MLCamera.MRQuality._960x720;
                context.MixedRealityConnectInfo.MRBlendType = MLCamera.MRBlendType.Additive;
                context.MixedRealityConnectInfo.FrameRate = MLCamera.CaptureFrameRate._30FPS;
            }

            captureCamera = MLCamera.CreateAndConnect(context);

            if (captureCamera != null)
            {
                Debug.Log("Camera device connected");
                if (GetImageStreamCapabilities())
                {
                    ShowToast("Camera device connected");
                    Debug.Log("Camera device received stream caps");
                    captureCamera.OnRawVideoFrameAvailable += OnCaptureRawVideoFrameAvailable;
                    captureCamera.OnRawImageAvailable += OnCaptureRawImageComplete;

                }
            }

        }

        private bool GetImageStreamCapabilities()
        {
            var result =
                captureCamera.GetStreamCapabilities(out MLCamera.StreamCapabilitiesInfo[] streamCapabilitiesInfo);

            if (!result.IsOk)
            {
                Debug.LogError("Could not get stream capabilities info.");
                return false;
            }

            streamCapabilities = new List<MLCamera.StreamCapability>();

            for (int i = 0; i < streamCapabilitiesInfo.Length; i++)
            {
                foreach (var streamCap in streamCapabilitiesInfo[i].StreamCapabilities)
                {
                    streamCapabilities.Add(streamCap);
                }
            }

            return streamCapabilities.Count > 0;
        }

        private void MediaPlayerOnOnPrepared(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.Play();
#endif
        }

        private void MediaPlayerOnCompletion(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.StopMLMediaPlayer();
#endif
            mediaPlayerBehavior.gameObject.SetActive(false);
            mediaPlayerBehavior.Reset();
        }

        private void OnCaptureRawVideoFrameAvailable(MLCamera.CameraOutput capturedFrame, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {
            // if (string.IsNullOrEmpty(captureInfoText.text) && isCapturingVideo)
            // {
            // captureInfoText.text = capturedFrame.ToString();
            // }



            if (OutputFormat == MLCamera.OutputFormat.RGBA_8888 && FrameRate == MLCamera.CaptureFrameRate._30FPS && streamCapabilities[0].Width >= 4096)
            {
                // cameraCaptureVisualizer cannot handle throughput of RGBA_8888 4096x3072 at 30 fps 
                skipFrame = !skipFrame;
                if (skipFrame)
                {
                    return;
                }
            }
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedFrame);
        }

        /// <summary>
        /// Handles the event of a new image getting captured.
        /// </summary>
        /// <param name="capturedImage">Captured frame.</param>
        /// <param name="resultExtras">Results Extras.</param>
        private void OnCaptureRawImageComplete(MLCamera.CameraOutput capturedImage, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {

            // isDisplayingImage = true;
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedImage);

            if (RecordToFile)
            {
                if (capturedImage.Format != MLCamera.OutputFormat.YUV_420_888)
                {
                    string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + ".jpg";
                    recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);
                    try
                    {
                        File.WriteAllBytes(recordedFilePath, capturedImage.Planes[0].Data);
                        // captureInfoText.text += $"\nSaved to {recordedFilePath}";
                    }
                    catch (Exception e)
                    {
                        Debug.LogError(e.Message);
                    }
                }
            }
        }
        public void ShowToast(string message)
        {
            if (Application.platform == RuntimePlatform.Android)
            {
                // Retrieve the UnityPlayer class
                AndroidJavaClass unityPlayerClass = new AndroidJavaClass("com.unity3d.player.UnityPlayer");

                // Retrieve the current activity from the UnityPlayer class
                AndroidJavaObject currentActivity = unityPlayerClass.GetStatic<AndroidJavaObject>("currentActivity");

                // Show the toast message
                currentActivity.Call("runOnUiThread", new AndroidJavaRunnable(() =>
                {
                    // Retrieve the Toast class
                    AndroidJavaClass toastClass = new AndroidJavaClass("android.widget.Toast");

                    // Create the Toast object
                    AndroidJavaObject toastObject = toastClass.CallStatic<AndroidJavaObject>("makeText", currentActivity, message, 0);

                    // Show the Toast
                    toastObject.Call("show");
                }));
            }
            else
            {
                Debug.Log("Toast message: " + message);
            }
        }


    }
}

Here is a reference recorded video using my script:

Here is the reference recorded video using Magic Leap hub "Advanced Capture":

@usman.bashir You cannot increase the field of view beyond what the hardware provides. The field of view is determined by the optics of the physical camera on the front of the Magic Leap device. When capturing virtual or mixed reality content, the field of view can be slightly decreased to more closely match the aspect ratio of the Magic Leap display. This prevents virtual content from being clipped at the edges of the Mixed Reality capture.

Depending on your use case, you might be able to use the World Camera for computer vision (CV) operations. The world cameras are black and white and cannot capture Mixed Reality or virtual content. See the World Camera example in the Magic Leap Examples project.
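For reference, here is a minimal sketch of polling the world cameras, loosely based on the World Camera example. This is not a drop-in implementation: the exact API surface (`MLWorldCamera.Settings`, `GetLatestWorldCameraData`, `Frame.FrameBuffer`, etc.) may differ between SDK versions, so check it against the version of the Unity SDK you have installed. It assumes the CAMERA permission has already been granted.

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;

// Sketch: connects to the ML2 world cameras and polls for the latest frames.
public class WorldCameraSketch : MonoBehaviour
{
    private MLWorldCamera worldCamera;

    private void Start()
    {
        // Request frames from all world cameras in normal-exposure mode.
        var settings = new MLWorldCamera.Settings(MLWorldCamera.Mode.NormalExposure,
                                                  MLWorldCamera.CameraId.All);
        worldCamera = new MLWorldCamera();
        MLResult result = worldCamera.Connect(in settings);
        if (!result.IsOk)
            Debug.LogError($"Failed to connect world cameras: {result}");
    }

    private void Update()
    {
        if (worldCamera == null || !worldCamera.IsConnected)
            return;

        // Poll for the most recent frame from each connected world camera.
        if (worldCamera.GetLatestWorldCameraData(out MLWorldCamera.Frame[] frames).IsOk)
        {
            foreach (var frame in frames)
            {
                // Each frame is a grayscale image; FrameBuffer holds the pixel data.
                Debug.Log($"World camera {frame.CameraId}: " +
                          $"{frame.FrameBuffer.Width}x{frame.FrameBuffer.Height}");
            }
        }
    }

    private void OnDestroy()
    {
        worldCamera?.Disconnect();
    }
}
```

Keep in mind that each world camera has its own pose and intrinsics, so frames from different cameras cannot simply be concatenated into one wide image without calibration.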

Note: The world camera API is experimental and may change in the future, which could break backward or forward compatibility with other OS versions.