Unity NullReferenceException & InvalidParam Error in Camera API Calls

Dear Community,

I'm having trouble with camera operations in my project. I am attempting to implement a camera preview in Unity, but I consistently encounter a NullReferenceException and an InvalidParam error when interacting with the MLCamera API.

The detailed error logs are as follows:

2023/05/28 13:28:06.782 26521 26542 Debug RefBase RefBase: Explicit destruction, weak count = 0 (in 0x71c8605ccd58)
2023/05/28 13:28:06.782 26521 26542 Warn RefBase CallStack::getCurrentInternal not linked, returning null
2023/05/28 13:28:06.782 26521 26542 Warn RefBase CallStack::logStackInternal not linked
2023/05/28 13:28:06.783 26521 26542 Error Unity NullReferenceException: Object reference not set to an instance of an object.
2023/05/28 13:28:06.783 26521 26542 Error Unity   at MagicLeap.Examples.CameraCaptureExample.DisableImageCaptureObject () [0x00000] in <00000000000000000000000000000000>:0 
2023/05/28 13:28:06.783 26521 26542 Error Unity   at MagicLeap.Examples.CameraCaptureExample.OnApplicationPause (System.Boolean isPaused) [0x00000] in <00000000000000000000000000000000>:0 
2023/05/28 13:28:06.783 26521 26542 Error Unity 
2023/05/28 13:28:06.793 26521 26681 Info ml_camera_client Camonly OnAvailable CamId = 0
2023/05/28 13:28:09.214 26521 26698 Info ml_camera_client Camonly OnUnavailable CamId = 0
2023/05/28 13:28:09.215 26521 26542 Info Unity Camera device connected
2023/05/28 13:28:09.215 26521 26542 Info Unity MagicLeap.Examples.TestCameraRecording:ConnectCamera()
2023/05/28 13:28:09.215 26521 26542 Info Unity MagicLeap.Examples.<EnableMLCamera>d__19:MoveNext()
2023/05/28 13:28:09.215 26521 26542 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/28 13:28:09.215 26521 26542 Info Unity MagicLeap.Examples.TestCameraRecording:TryEnableMLCamera()
2023/05/28 13:28:09.215 26521 26542 Info Unity UnityEngine.XR.MagicLeap.<>c__DisplayClass20_0:<OnPermissionGranted>b__0()
2023/05/28 13:28:09.215 26521 26542 Info Unity UnityEngine.XR.MagicLeap.MLPermissions:OnPermissionGranted(String)
2023/05/28 13:28:09.215 26521 26542 Info Unity System.Reflection.RuntimeMethodInfo:Invoke(Object, BindingFlags, Binder, Object[], CultureInfo)
2023/05/28 13:28:09.215 26521 26542 Info Unity System.Reflection.MethodBase:Invoke(Object, Object[])
2023/05/28 13:28:09.215 26521 26542 Info Unity UnityEngine.AndroidJavaProxy:Invoke(String, Object[])
2023/05/28 13:28:09.215 26521 26542 Info Unity UnityEngine.AndroidJavaProxy:Invoke(String, IntPtr)
2023/05/28 13:28:09.215 26521 26542 Info Unity UnityEngine._AndroidJNIHelper:InvokeJavaProxyMethod(AndroidJavaProxy, IntPtr, IntPtr)
2023/05/28 13:28:09.215 26521 26542 Info Unity 
2023/05/28 13:28:09.216 26521 26542 Info Unity Camera device received stream caps
2023/05/28 13:28:09.216 26521 26542 Info Unity MagicLeap.Examples.TestCameraRecording:ConnectCamera()
2023/05/28 13:28:09.216 26521 26542 Info Unity MagicLeap.Examples.<EnableMLCamera>d__19:MoveNext()
2023/05/28 13:28:09.216 26521 26542 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/28 13:28:09.216 26521 26542 Info Unity MagicLeap.Examples.TestCameraRecording:TryEnableMLCamera()
2023/05/28 13:28:09.216 26521 26542 Info Unity UnityEngine.XR.MagicLeap.<>c__DisplayClass20_0:<OnPermissionGranted>b__0()
2023/05/28 13:28:09.216 26521 26542 Info Unity UnityEngine.XR.MagicLeap.MLPermissions:OnPermissionGranted(String)
2023/05/28 13:28:09.216 26521 26542 Info Unity System.Reflection.RuntimeMethodInfo:Invoke(Object, BindingFlags, Binder, Object[], CultureInfo)
2023/05/28 13:28:09.216 26521 26542 Info Unity System.Reflection.MethodBase:Invoke(Object, Object[])
2023/05/28 13:28:09.216 26521 26542 Info Unity UnityEngine.AndroidJavaProxy:Invoke(String, Object[])
2023/05/28 13:28:09.216 26521 26542 Info Unity UnityEngine.AndroidJavaProxy:Invoke(String, IntPtr)
2023/05/28 13:28:09.216 26521 26542 Info Unity UnityEngine._AndroidJNIHelper:InvokeJavaProxyMethod(AndroidJavaProxy, IntPtr, IntPtr)
2023/05/28 13:28:09.216 26521 26542 Info Unity 
2023/05/28 13:28:09.217 26521 26542 Error Unity NullReferenceException: Object reference not set to an instance of an object.
2023/05/28 13:28:09.217 26521 26542 Error Unity MagicLeap.Examples.TestCameraRecording:TryEnableMLCamera()
2023/05/28 13:28:09.217 26521 26542 Error Unity UnityEngine.XR.MagicLeap.<>c__DisplayClass20_0:<OnPermissionGranted>b__0()
2023/05/28 13:28:09.217 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLPermissions:OnPermissionGranted(String)
2023/05/28 13:28:09.217 26521 26542 Error Unity System.Reflection.RuntimeMethodInfo:Invoke(Object, BindingFlags, Binder, Object[], CultureInfo)
2023/05/28 13:28:09.217 26521 26542 Error Unity System.Reflection.MethodBase:Invoke(Object, Object[])
2023/05/28 13:28:09.217 26521 26542 Error Unity UnityEngine.AndroidJavaProxy:Invoke(String, Object[])
2023/05/28 13:28:09.217 26521 26542 Error Unity UnityEngine.AndroidJavaProxy:Invoke(String, IntPtr)
2023/05/28 13:28:09.217 26521 26542 Error Unity UnityEngine._AndroidJNIHelper:InvokeJavaProxyMethod(AndroidJavaProxy, IntPtr, IntPtr)
2023/05/28 13:28:09.217 26521 26542 Error Unity 
2023/05/28 13:28:09.219 26521 26542 Info Input6DofFilter input_6dof_filter.cpp:92: Input6DofFilter was created
2023/05/28 13:28:09.220 26521 26542 Info Input6DofFilter input_6dof_filter.cpp:92: Input6DofFilter was created
2023/05/28 13:28:09.220 26521 26542 Info ml_input ml_input.cpp:1382: At application start, found controller device 257, type 1, registered as index 0
2023/05/28 13:28:09.221 26521 26542 Error Unity NullReferenceException: Object reference not set to an instance of an object.
2023/05/28 13:28:09.221 26521 26542 Error Unity 
2023/05/28 13:28:09.221 26521 26542 Error Unity 
2023/05/28 13:28:09.223 26521 26542 Error Unity Error: MLCameraConnect in the Magic Leap API failed. Reason: MLResult_InvalidParam 
2023/05/28 13:28:09.223 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLResult:DidNativeCallSucceed(Code, String, Predicate`1, Boolean)
2023/05/28 13:28:09.223 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLCameraBase:InternalConnect(ConnectContext)
2023/05/28 13:28:09.223 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLCamera:Resume()
2023/05/28 13:28:09.223 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLCamera:OnApplicationPause(Boolean)
2023/05/28 13:28:09.223 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLDevice:OnApplicationPause(Boolean)
2023/05/28 13:28:09.223 26521 26542 Error Unity 
2023/05/28 13:28:09.223 26521 26542 Error Unity MLCamera.Connect failed connecting to the camera. Reason: InvalidParam
2023/05/28 13:28:09.224 26521 26542 Error Unity Error: MLCamera.Resume failed to connect camera. Reason: InvalidParam
2023/05/28 13:28:09.224 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLCamera:Resume()
2023/05/28 13:28:09.224 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLCamera:OnApplicationPause(Boolean)
2023/05/28 13:28:09.224 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLDevice:OnApplicationPause(Boolean)
2023/05/28 13:28:09.224 26521 26542 Error Unity 
2023/05/28 13:28:09.224 26521 26542 Error Unity MLCamera.ApplicationPause failed to resume the camera. Reason: InvalidParam
2023/05/28 13:28:09.224 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLCamera:OnApplicationPause(Boolean)
2023/05/28 13:28:09.224 26521 26542 Error Unity UnityEngine.XR.MagicLeap.MLDevice:OnApplicationPause(Boolean)
2023/05/28 13:28:09.224 26521 26542 Error Unity 
2023/05/28 13:28:09.227 26521 26538 Info ml_input ml_input.cpp:198: on_trigger, controller_id: 0, trigger event: Pull, trigger depth: 0.313726
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack Format:
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack   samples_per_second=24000
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack   channel_count=2
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack   bits_per_sample=16
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack   valid_bits_per_sample=16
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack   sample_format=0
2023/05/28 13:28:09.248 26521 26699 Info AudioTrack   channel_format=0
2023/05/28 13:28:09.249 26521 26699 Warn AudioTrack Use of stream types is deprecated for operations other than volume control
2023/05/28 13:28:09.249 26521 26699 Warn AudioTrack See the documentation of AudioTrack() for what to use instead with android.media.AudioAttributes to qualify your playback use case
2023/05/28 13:28:09.250 26521 26521 Verbose MediaRouter Selecting route: RouteInfo{ name=Phone, description=null, status=null, category=RouteCategory{ name=System types=ROUTE_TYPE_LIVE_AUDIO ROUTE_TYPE_LIVE_VIDEO  groupable=false }, supportedTypes=ROUTE_TYPE_LIVE_AUDIO ROUTE_TYPE_LIVE_VIDEO , presentationDisplay=null }
2023/05/28 13:28:09.257 26521 26538 Info ml_input ml_input.cpp:198: on_trigger, controller_id: 0, trigger event: Pull, trigger depth: 0.145098
2023/05/28 13:28:09.287 26521 26538 Info ml_input ml_input.cpp:198: on_trigger, controller_id: 0, trigger event: Release, trigger depth: 0.000000
2023/05/28 13:28:17.257 26521 26542 Debug RefBase RefBase: Explicit destruction, weak count = 0 (in 0x71c8605ccd58)
2023/05/28 13:28:17.257 26521 26542 Warn RefBase CallStack::getCurrentInternal not linked, returning null
2023/05/28 13:28:17.257 26521 26542 Warn RefBase CallStack::logStackInternal not linked
2023/05/28 13:28:17.257 26521 26542 Error Unity NullReferenceException: Object reference not set to an instance of an object.
2023/05/28 13:28:17.257 26521 26542 Error Unity   at MagicLeap.Examples.CameraCaptureExample.DisableImageCaptureObject () [0x00000] in <00000000000000000000000000000000>:0 
2023/05/28 13:28:17.257 26521 26542 Error Unity   at MagicLeap.Examples.CameraCaptureExample.OnApplicationPause (System.Boolean isPaused) [0x00000] in <00000000000000000000000000000000>:0 
2023/05/28 13:28:17.257 26521 26542 Error Unity 
2023/05/28 13:28:17.287 26521 26731 Info ml_camera_client Camonly OnAvailable CamId = 0
2023/05/28 13:28:18.745 26521 26521 Info Process Sending signal. PID: 26521 SIG: 9

Some background on my project: I am working on a camera capture example in which I handle the camera's availability, connect and enable the camera, preview it on a canvas background, and finally disable it when the application pauses. Based on the stack traces, I suspect the exceptions originate in the DisableImageCaptureObject() method of the CameraCaptureExample class and the TryEnableMLCamera() method of the TestCameraRecording class.

I'm also getting an InvalidParam error from the MLCamera.Connect and MLCamera.Resume methods. This is unclear to me, as I believe I am providing all the necessary parameters.
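For reference, this is a stripped-down version of my connect call with every field set explicitly (mirroring what ConnectCamera() in the script below does); I am noting it here because I suspect an unset or default-valued field, rather than the call itself, is what produces InvalidParam:

```csharp
// Minimal connect sketch: set every ConnectContext field explicitly so no
// default/None value reaches MLCameraConnect. (With CamOnly, no
// MixedRealityConnectInfo should be required in this configuration.)
MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
context.Flags = MLCamera.ConnectFlag.CamOnly;
context.EnableVideoStabilization = false;

MLCamera camera = MLCamera.CreateAndConnect(context);
if (camera == null)
{
    Debug.LogError("MLCamera.CreateAndConnect failed");
}
```

This is only my understanding of the minimum required setup, not a confirmed fix.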

Here is the current version of the script:

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using MagicLeap.Core;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.MagicLeap;


namespace MagicLeap.Examples
{

    /// <summary>
    /// This class handles video recording and image capturing based on controller input.
    /// </summary>
    public class TestCameraRecording : MonoBehaviour
    {
        private MLCamera.CaptureFrameRate FrameRate;
        private MLCamera.OutputFormat OutputFormat;
        private MLCamera captureCamera;
        private bool isCapturingVideo = false;

        [SerializeField, Tooltip("Reference to media player behavior used in camera capture playback")]
        private MLMediaPlayerBehavior mediaPlayerBehavior;

        private readonly CameraRecorder cameraRecorder = new CameraRecorder();
        private const string validFileFormat = ".mp4";

        private bool isCapturingPreview = false;
        private bool RecordToFile = true;

        private string recordedFilePath;
        private MLCamera.CaptureType CaptureType = MLCamera.CaptureType.Video;

        private List<MLCamera.StreamCapability> streamCapabilities;

        [SerializeField, Tooltip("Button that starts the Capture")]
        private Button captureButton;
        private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

        private bool cameraDeviceAvailable;

        [SerializeField, Tooltip("Reference to the Raw Video Capture Visualizer gameobject for YUV frames")]
        private CameraCaptureVisualizer cameraCaptureVisualizer = null;

        private void Awake()
        {
            permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
            permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
            permissionCallbacks.OnPermissionDeniedAndDontAskAgain += OnPermissionDenied;

            //connectionFlagDropdown.AddOptions(
            //    MLCamera.ConnectFlag.CamOnly,
            //    MLCamera.ConnectFlag.MR,
            //    MLCamera.ConnectFlag.VirtualOnly);

            //captureButton.onClick.AddListener(OnCaptureButtonClicked);
            //connectButton.onClick.AddListener(ConnectCamera);
            //disconnectButton.onClick.AddListener(DisconnectCamera);
            //connectionFlagDropdown.onValueChanged.AddListener(v => RefreshUI());
            //streamCapabilitiesDropdown.onValueChanged.AddListener(v => RefreshUI());
            //qualityDropDown.onValueChanged.AddListener(v => RefreshUI());
            //captureTypeDropDown.onValueChanged.AddListener(v => RefreshUI());
            //frameRateDropDown.onValueChanged.AddListener(v => RefreshUI());

            //RefreshUI();
        }

        // Start is called before the first frame update

        private void Start()
        {
            Debug.Log("Start");
            MLPermissions.RequestPermission(MLPermission.Camera, permissionCallbacks);
            MLPermissions.RequestPermission(MLPermission.RecordAudio, permissionCallbacks);

            TryEnableMLCamera();
        }
        
        private void TryEnableMLCamera()
        {
            if (!MLPermissions.CheckPermission(MLPermission.Camera).IsOk)
                return;

            StartCoroutine(EnableMLCamera());
        }

        private IEnumerator EnableMLCamera()
        {
            while (!cameraDeviceAvailable)
            {
                MLResult result =
                    MLCamera.GetDeviceAvailabilityStatus(MLCamera.Identifier.Main, out cameraDeviceAvailable);
                if (!(result.IsOk && cameraDeviceAvailable))
                {
                    // Wait until camera device is available
                    yield return new WaitForSeconds(1.0f);
                }
                else
                {
                    // Camera device is available; connect and start capture
                    ConnectCamera();
                }
            }

            Debug.Log("Camera device available");
        }

        // Update is called once per frame
        void Update()
        {
        }


        private void OnPermissionDenied(string permission)
        {
            if (permission == MLPermission.Camera)
            {
                MLPluginLog.Error($"{permission} denied, example won't function.");
            }
            else if (permission == MLPermission.RecordAudio)
            {
                MLPluginLog.Error($"{permission} denied, audio won't be recorded in the file.");
            }

            //RefreshUI();
        }



        private void OnPermissionGranted(string permission)
        {
            MLPluginLog.Debug($"Granted {permission}.");
            TryEnableMLCamera();

            //RefreshUI();
        }

        private void StartVideoCapture()
        {
            // recordedFilePath = string.Empty;
            // skipFrame = false;

            var result = MLPermissions.CheckPermission(MLPermission.Camera);
            MLResult.DidNativeCallSucceed(result.Result, nameof(MLPermissions.RequestPermission));

            if (!result.IsOk)
            {
                Debug.LogError($"{MLPermission.Camera} permission denied. Video will not be recorded.");
                return;
            }

            if (RecordToFile)
                StartRecording();
            else
                StartPreview();
        }

        private void StartRecording()
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.MediaPlayer.OnPrepared += MediaPlayerOnOnPrepared;
            mediaPlayerBehavior.MediaPlayer.OnCompletion += MediaPlayerOnCompletion;
#endif
            string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + validFileFormat;
            recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);

            CameraRecorderConfig config = CameraRecorderConfig.CreateDefault();
            config.Width = streamCapabilities[0].Width;
            config.Height = streamCapabilities[0].Height;
            config.FrameRate = MapFrameRate(MLCamera.CaptureFrameRate._60FPS);

            cameraRecorder.StartRecording(recordedFilePath, config);

            int MapFrameRate(MLCamera.CaptureFrameRate frameRate)
            {
                switch (frameRate)
                {
                    case MLCamera.CaptureFrameRate.None: return 0;
                    case MLCamera.CaptureFrameRate._15FPS: return 15;
                    case MLCamera.CaptureFrameRate._30FPS: return 30;
                    case MLCamera.CaptureFrameRate._60FPS: return 60;
                    default: return 0;
                }
            }

            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = FrameRate;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] = MLCamera.CaptureStreamConfig.Create(streamCapabilities[0], OutputFormat);
            captureConfig.StreamConfigs[0].Surface = cameraRecorder.MediaRecorder.InputSurface;

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));
                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, RecordToFile);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    if (isCapturingPreview)
                    {
                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, RecordToFile);
                    }
                }
            }
        }
        
        private void StartPreview()
        {
            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._60FPS;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] =
                MLCamera.CaptureStreamConfig.Create(streamCapabilities[0], OutputFormat);

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                result = captureCamera.CapturePreviewStart();
                isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                Debug.LogError($"isCapturingPreview {isCapturingPreview}");

                cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, true);
            }
        }

        private void ConnectCamera()
        {
            MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
            context.Flags = MLCamera.ConnectFlag.CamOnly;
            context.EnableVideoStabilization = true;

            if (context.Flags != MLCamera.ConnectFlag.CamOnly)
            {
                context.MixedRealityConnectInfo = MLCamera.MRConnectInfo.Create();
                context.MixedRealityConnectInfo.MRQuality = MLCamera.MRQuality._960x720;
                context.MixedRealityConnectInfo.MRBlendType = MLCamera.MRBlendType.Additive;
                context.MixedRealityConnectInfo.FrameRate = MLCamera.CaptureFrameRate._60FPS;
            }

            captureCamera = MLCamera.CreateAndConnect(context);

            if (captureCamera != null)
            {
                Debug.Log("Camera device connected");
                if (GetImageStreamCapabilities())
                {
                    Debug.Log("Camera device received stream caps");
                    // captureCamera.OnRawVideoFrameAvailable += OnCaptureRawVideoFrameAvailable;
                    // captureCamera.OnRawImageAvailable += OnCaptureRawImageComplete;

                    StartVideoCapture();
                }
            }

        }

        private bool GetImageStreamCapabilities()
        {
            var result =
                captureCamera.GetStreamCapabilities(out MLCamera.StreamCapabilitiesInfo[] streamCapabilitiesInfo);

            if (!result.IsOk)
            {
                Debug.Log("Could not get Stream capabilities Info.");
                return false;
            }

            streamCapabilities = new List<MLCamera.StreamCapability>();

            for (int i = 0; i < streamCapabilitiesInfo.Length; i++)
            {
                foreach (var streamCap in streamCapabilitiesInfo[i].StreamCapabilities)
                {
                    streamCapabilities.Add(streamCap);
                }
            }

            return streamCapabilities.Count > 0;
        }

        private void MediaPlayerOnOnPrepared(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.Play();
#endif
        }

        private void MediaPlayerOnCompletion(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.StopMLMediaPlayer();
#endif
            mediaPlayerBehavior.gameObject.SetActive(false);
            mediaPlayerBehavior.Reset();
        }
    
    }
}
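Separately, the NullReferenceException in OnApplicationPause makes me think the pause handler touches the camera after it has been torn down (or before it ever connected). For comparison, this is the null-guarded pause handling I would expect to need; it is only a sketch, and I am assuming CaptureVideoStop/CapturePreviewStop/Disconnect are the right teardown calls here:

```csharp
private void OnApplicationPause(bool isPaused)
{
    // Guard every reference: the camera may never have connected,
    // or may already have been torn down by a previous pause.
    if (captureCamera == null)
        return;

    if (isPaused)
    {
        if (isCapturingVideo)
            captureCamera.CaptureVideoStop();
        if (isCapturingPreview)
            captureCamera.CapturePreviewStop();

        captureCamera.Disconnect();
        captureCamera = null;
    }
}
```

The guard pattern is the point here, not the exact method names.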

Could anyone provide insight into these issues? Any help would be much appreciated!

Thanks in advance,

Muhammad Usman Bashir

Update: I have revised the script after doing more research on the Magic Leap 2 Unity examples.

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using MagicLeap.Core;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.MagicLeap;

namespace MagicLeap.Examples
{
    /// <summary>
    /// This class handles video recording and image capturing based on controller input.
    /// </summary>
    public class TestCameraRecording : MonoBehaviour
    {
        private MLCamera.CaptureFrameRate FrameRate = MLCamera.CaptureFrameRate._60FPS;
        private MLCamera.OutputFormat OutputFormat = MLCamera.OutputFormat.RGBA_8888;
        private MLCamera captureCamera;
        private bool isCapturingVideo = false;

        private bool skipFrame = false;

        [SerializeField, Tooltip("Reference to media player behavior used in camera capture playback")]
        private MLMediaPlayerBehavior mediaPlayerBehavior;

        private readonly CameraRecorder cameraRecorder = new CameraRecorder();

        private const string validFileFormat = ".mp4";

        private bool isCapturingPreview = false;
        private bool RecordToFile = true;

        private string recordedFilePath;
        private MLCamera.CaptureType CaptureType = MLCamera.CaptureType.Video;

        private List<MLCamera.StreamCapability> streamCapabilities;

        [SerializeField, Tooltip("Button that starts the Capture")]
        private Button captureButton;
        private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

        private bool cameraDeviceAvailable;

        [SerializeField, Tooltip("Reference to the Raw Video Capture Visualizer gameobject for YUV frames")]
        private CameraCaptureVisualizer cameraCaptureVisualizer = null;

        private void Awake()
        {
            permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
            permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
            permissionCallbacks.OnPermissionDeniedAndDontAskAgain += OnPermissionDenied;

            //connectionFlagDropdown.AddOptions(
            //    MLCamera.ConnectFlag.CamOnly,
            //    MLCamera.ConnectFlag.MR,
            //    MLCamera.ConnectFlag.VirtualOnly);

            //captureButton.onClick.AddListener(OnCaptureButtonClicked);
            //connectButton.onClick.AddListener(ConnectCamera);
            //disconnectButton.onClick.AddListener(DisconnectCamera);
            //connectionFlagDropdown.onValueChanged.AddListener(v => RefreshUI());
            //streamCapabilitiesDropdown.onValueChanged.AddListener(v => RefreshUI());
            //qualityDropDown.onValueChanged.AddListener(v => RefreshUI());
            //captureTypeDropDown.onValueChanged.AddListener(v => RefreshUI());
            //frameRateDropDown.onValueChanged.AddListener(v => RefreshUI());

            //RefreshUI();
        }

        // Start is called before the first frame update
        private void Start()
        {
            Debug.Log("Start");
            MLPermissions.RequestPermission(MLPermission.Camera, permissionCallbacks);
            MLPermissions.RequestPermission(MLPermission.RecordAudio, permissionCallbacks);

            TryEnableMLCamera();
        }

        private void TryEnableMLCamera()
        {
            if (!MLPermissions.CheckPermission(MLPermission.Camera).IsOk)
                return;

            StartCoroutine(EnableMLCamera());
        }

        private IEnumerator EnableMLCamera()
        {
            while (!cameraDeviceAvailable)
            {
                MLResult result =
                    MLCamera.GetDeviceAvailabilityStatus(MLCamera.Identifier.Main, out cameraDeviceAvailable);
                if (!(result.IsOk && cameraDeviceAvailable))
                {
                    // Wait until camera device is available
                    yield return new WaitForSeconds(1.0f);
                }
                else
                {
                    // Camera device is available; connect and start capture
                    ConnectCamera();
                }
            }

            Debug.Log("Camera device available");
        }

        // Update is called once per frame
        void Update()
        {
        }

        private void OnPermissionDenied(string permission)
        {
            if (permission == MLPermission.Camera)
            {
                MLPluginLog.Error($"{permission} denied, example won't function.");
            }
            else if (permission == MLPermission.RecordAudio)
            {
                MLPluginLog.Error($"{permission} denied, audio won't be recorded in the file.");
            }

            //RefreshUI();
        }

        private void OnPermissionGranted(string permission)
        {
            MLPluginLog.Debug($"Granted {permission}.");
            TryEnableMLCamera();

            //RefreshUI();
        }

        private void StartVideoCapture()
        {
            // recordedFilePath = string.Empty;
            // skipFrame = false;

            var result = MLPermissions.CheckPermission(MLPermission.Camera);
            MLResult.DidNativeCallSucceed(result.Result, nameof(MLPermissions.RequestPermission));
            Debug.Log($"MLPermissions.CheckPermission {result}");
            if (!result.IsOk)
            {
                Debug.LogError($"{MLPermission.Camera} permission denied. Video will not be recorded.");
                return;
            }

            if (!RecordToFile)
                StartRecording();
            else
                StartPreview();
        }

        private void StartRecording()
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.MediaPlayer.OnPrepared += MediaPlayerOnOnPrepared;
            mediaPlayerBehavior.MediaPlayer.OnCompletion += MediaPlayerOnCompletion;
#endif
            string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + validFileFormat;
            recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);

            CameraRecorderConfig config = CameraRecorderConfig.CreateDefault();
            config.Width = streamCapabilities[3].Width;
            config.Height = streamCapabilities[3].Height;
            config.FrameRate = MapFrameRate(MLCamera.CaptureFrameRate._60FPS);

            cameraRecorder.StartRecording(recordedFilePath, config);

            int MapFrameRate(MLCamera.CaptureFrameRate frameRate)
            {
                switch (frameRate)
                {
                    case MLCamera.CaptureFrameRate.None: return 0;
                    case MLCamera.CaptureFrameRate._15FPS: return 15;
                    case MLCamera.CaptureFrameRate._30FPS: return 30;
                    case MLCamera.CaptureFrameRate._60FPS: return 60;
                    default: return 0;
                }
            }

            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = FrameRate;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] = MLCamera.CaptureStreamConfig.Create(streamCapabilities[3], OutputFormat);
            captureConfig.StreamConfigs[0].Surface = cameraRecorder.MediaRecorder.InputSurface;

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            Debug.Log($"Check Camera is ready for capture {MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture))}");

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));

                    Debug.LogError($"isCapturingVideo {isCapturingVideo}");

                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, RecordToFile);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    if (isCapturingPreview)
                    {
                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, RecordToFile);
                    }
                }
            }
        }
        private void StartPreview()
        {
            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._60FPS;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] =
                MLCamera.CaptureStreamConfig.Create(streamCapabilities[3], OutputFormat);
​
            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);
​
            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();
​
                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));
​
                    Debug.LogError($"isCapturingVideo {isCapturingVideo} ");
                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, true);
                    }
                }
​
                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    Debug.LogError($"isCapturingPreview {isCapturingPreview} ");
                    if (isCapturingPreview)
                    {
​
                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, true);
                    }
                }
            }
        }
​
        private void ConnectCamera()
        {
            MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
            context.Flags = MLCamera.ConnectFlag.CamOnly;
            context.EnableVideoStabilization = true;
​
            if (context.Flags != MLCamera.ConnectFlag.CamOnly)
            {
                context.MixedRealityConnectInfo = MLCamera.MRConnectInfo.Create();
                context.MixedRealityConnectInfo.MRQuality = MLCamera.MRQuality._960x720;
                context.MixedRealityConnectInfo.MRBlendType = MLCamera.MRBlendType.Additive;
                context.MixedRealityConnectInfo.FrameRate = MLCamera.CaptureFrameRate._60FPS;
            }
​
            captureCamera = MLCamera.CreateAndConnect(context);
​
            if (captureCamera != null)
            {
                Debug.Log("Camera device connected");
                if (GetImageStreamCapabilities())
                {
                    Debug.Log("Camera device received stream caps");
                    captureCamera.OnRawVideoFrameAvailable += OnCaptureRawVideoFrameAvailable;
                    captureCamera.OnRawImageAvailable += OnCaptureRawImageComplete;
​
                    StartVideoCapture();
                }
            }
​
        }
​
        private bool GetImageStreamCapabilities()
        {
            var result =
                captureCamera.GetStreamCapabilities(out MLCamera.StreamCapabilitiesInfo[] streamCapabilitiesInfo);
​
            if (!result.IsOk)
            {
                Debug.Log("Could not get Stream capabilities Info.");
                return false;
            }
​
            streamCapabilities = new List<MLCamera.StreamCapability>();
​
            for (int i = 0; i < streamCapabilitiesInfo.Length; i++)
            {
                foreach (var streamCap in streamCapabilitiesInfo[i].StreamCapabilities)
                {
                    Debug.LogError($"streamCapabilitiesInfo {streamCap} ");
                    streamCapabilities.Add(streamCap);
                }
            }
​
            return streamCapabilities.Count > 0;
        }
​
        private void MediaPlayerOnOnPrepared(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.Play();
#endif
        }
​
        private void MediaPlayerOnCompletion(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.StopMLMediaPlayer();
#endif
            mediaPlayerBehavior.gameObject.SetActive(false);
            mediaPlayerBehavior.Reset();
        }
​
        private void OnCaptureRawVideoFrameAvailable(MLCamera.CameraOutput capturedFrame, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {
            // if (string.IsNullOrEmpty(captureInfoText.text) && isCapturingVideo)
            // {
            // captureInfoText.text = capturedFrame.ToString();
            // }
​
​
​
            if (OutputFormat == MLCamera.OutputFormat.RGBA_8888 && FrameRate == MLCamera.CaptureFrameRate._30FPS && streamCapabilities[0].Width >= 4096)
            {
                // cameraCaptureVisualizer cannot handle throughput of RGBA_8888 4096x3072 at 30 fps 
                skipFrame = !skipFrame;
                if (skipFrame)
                {
                    return;
                }
            }
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedFrame);
        }
​
        /// <summary>
        /// Handles the event of a new image getting captured.
        /// </summary>
        /// <param name="capturedImage">Captured frame.</param>
        /// <param name="resultExtras">Results Extras.</param>
        private void OnCaptureRawImageComplete(MLCamera.CameraOutput capturedImage, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {
​
            // isDisplayingImage = true;
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedImage);
​
            if (RecordToFile)
            {
                if (capturedImage.Format != MLCamera.OutputFormat.YUV_420_888)
                {
                    string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + ".jpg";
                    recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);
                    try
                    {
                        File.WriteAllBytes(recordedFilePath, capturedImage.Planes[0].Data);
                        // captureInfoText.text += $"\nSaved to {recordedFilePath}";
                    }
                    catch (Exception e)
                    {
                        Debug.LogError(e.Message);
                    }
                }
            }
        }
​
​
​
    }
​
}
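For reference, my reading of the NullReferenceException in the log above is that `OnApplicationPause` can fire during teardown, after scene objects have already been destroyed, so `DisableImageCaptureObject` dereferences a dead object. A minimal defensive sketch (the `imageCaptureObject` field name is hypothetical; `CapturePreviewStop` is the SDK counterpart to `CapturePreviewStart`):

```csharp
private void OnApplicationPause(bool isPaused)
{
    if (!isPaused)
        return;

    // Guard against the pause callback firing after objects are destroyed.
    // (imageCaptureObject is a hypothetical name for whatever scene object
    // DisableImageCaptureObject touches in the example.)
    if (imageCaptureObject != null)
    {
        imageCaptureObject.SetActive(false);
    }

    // Stop any active preview before the app loses the camera.
    if (captureCamera != null && isCapturingPreview)
    {
        captureCamera.CapturePreviewStop();
        isCapturingPreview = false;
    }
}
```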

The issues have since changed. If you build a simple Unity app against the Magic Leap 2 SDK and run this script, it should reproduce the behavior and give you all the relevant insights.

Kind regards.

After further improvements and investigation across the application, I am now stuck: the application hangs at

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

which is called from the following function:


 private void StartPreview()
        {
            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._60FPS;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] =
                MLCamera.CaptureStreamConfig.Create(streamCapabilities[3], OutputFormat);

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));

                    Debug.LogError($"isCapturingVideo {isCapturingVideo} ");
                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, true);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    Debug.LogError($"isCapturingPreview {isCapturingPreview} ");
                    if (isCapturingPreview)
                    {

                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, true);
                    }
                }
            }
        }
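One thing worth noting about the function above: `PrepareCapture` returning InvalidParam (or stalling) can happen when the hard-coded `streamCapabilities[3]` entry does not match the intended capture type, since each `StreamCapability` carries its own `CaptureType`. A sketch of selecting a matching capability instead, assuming the SDK's `TryGetBestFitStreamCapabilityFromCollection` helper and an illustrative 1280x720 target size:

```csharp
// Pick a stream capability that matches the desired capture type instead of
// hard-coding streamCapabilities[3]. The 1280x720 target is an assumption;
// adjust to your needs.
MLCamera.StreamCapability[] caps = streamCapabilities.ToArray();
if (MLCamera.TryGetBestFitStreamCapabilityFromCollection(
        caps, 1280, 720, MLCamera.CaptureType.Video,
        out MLCamera.StreamCapability selectedCapability))
{
    var captureConfig = new MLCamera.CaptureConfig();
    captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._30FPS;
    captureConfig.StreamConfigs = new[]
    {
        MLCamera.CaptureStreamConfig.Create(selectedCapability, OutputFormat)
    };
    MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);
    Debug.Log($"PrepareCapture: {result.Result}");
}
```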

I am making progress with my research work. I no longer see the errors above, but I am still unable to preview the camera in the background. The camera connects successfully, and the green privacy indicator is visible on screen while the app runs.
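One possible explanation: the preview branch inside `StartPreview` only runs when `CaptureType == MLCamera.CaptureType.Preview`, but `CaptureType` is initialized to `Video` in this script, so `CapturePreviewStart` is never called. A preview-only variant, sketched under the assumption that a Preview-type `StreamCapability` is passed in:

```csharp
// Preview-only variant: forces the preview path rather than relying on the
// CaptureType field, which this script initializes to Video.
private void StartPreviewOnly(MLCamera.StreamCapability previewCapability)
{
    var captureConfig = new MLCamera.CaptureConfig();
    captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._30FPS;
    captureConfig.StreamConfigs = new[]
    {
        MLCamera.CaptureStreamConfig.Create(previewCapability, MLCamera.OutputFormat.RGBA_8888)
    };

    MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);
    if (!result.IsOk)
        return;

    captureCamera.PreCaptureAEAWB();
    if (captureCamera.CapturePreviewStart().IsOk)
    {
        isCapturingPreview = true;
        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, false);
    }
}
```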

2023/05/30 12:37:57.592 11776 11797 Info Unity Change to UI - Disabled last UI: Canvas_Menu_Login
2023/05/30 12:37:57.592 11776 11797 Info Unity UIStateController:ChangeToUI(GameObject, GameEventListener, Boolean)
2023/05/30 12:37:57.592 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.592 11776 11797 Info Unity GameEvent:Raise()
2023/05/30 12:37:57.592 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.592 11776 11797 Info Unity UnityEngine.UI.Button:OnSubmit(BaseEventData)
2023/05/30 12:37:57.592 11776 11797 Info Unity UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction`1)
2023/05/30 12:37:57.592 11776 11797 Info Unity WF_LaserPointer:OnPointerClick()
2023/05/30 12:37:57.592 11776 11797 Info Unity <ClickCycle>d__23:MoveNext()
2023/05/30 12:37:57.592 11776 11797 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/30 12:37:57.592 11776 11797 Info Unity 
2023/05/30 12:37:57.593 11776 11797 Info Unity Change to UI - Inserted new UI on stack: Canvas_Record
2023/05/30 12:37:57.593 11776 11797 Info Unity UIStateController:ChangeToUI(GameObject, GameEventListener, Boolean)
2023/05/30 12:37:57.593 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.593 11776 11797 Info Unity GameEvent:Raise()
2023/05/30 12:37:57.593 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.593 11776 11797 Info Unity UnityEngine.UI.Button:OnSubmit(BaseEventData)
2023/05/30 12:37:57.593 11776 11797 Info Unity UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction`1)
2023/05/30 12:37:57.593 11776 11797 Info Unity WF_LaserPointer:OnPointerClick()
2023/05/30 12:37:57.593 11776 11797 Info Unity <ClickCycle>d__23:MoveNext()
2023/05/30 12:37:57.593 11776 11797 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/30 12:37:57.593 11776 11797 Info Unity 
2023/05/30 12:37:57.594 11776 11797 Info Unity ------------
2023/05/30 12:37:57.594 11776 11797 Info Unity UIStateController:ChangeToUI(GameObject, GameEventListener, Boolean)
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.594 11776 11797 Info Unity GameEvent:Raise()
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.UI.Button:OnSubmit(BaseEventData)
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction`1)
2023/05/30 12:37:57.594 11776 11797 Info Unity WF_LaserPointer:OnPointerClick()
2023/05/30 12:37:57.594 11776 11797 Info Unity <ClickCycle>d__23:MoveNext()
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/30 12:37:57.594 11776 11797 Info Unity 
2023/05/30 12:37:57.594 11776 11797 Info Unity Canvas_Record
2023/05/30 12:37:57.594 11776 11797 Info Unity Canvas_Menu_Login
2023/05/30 12:37:57.594 11776 11797 Info Unity ------------
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.594 11776 11797 Info Unity GameEvent:Raise()
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.Events.UnityEvent:Invoke()
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.UI.Button:OnSubmit(BaseEventData)
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.EventSystems.ExecuteEvents:Execute(GameObject, BaseEventData, EventFunction`1)
2023/05/30 12:37:57.594 11776 11797 Info Unity WF_LaserPointer:OnPointerClick()
2023/05/30 12:37:57.594 11776 11797 Info Unity <ClickCycle>d__23:MoveNext()
2023/05/30 12:37:57.594 11776 11797 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/30 12:37:57.594 11776 11797 Info Unity 
2023/05/30 12:37:57.605 11776 11797 Info Unity Start
2023/05/30 12:37:57.629 11776 11893 Info ml_camera_client Camonly OnAvailable CamId = 0
2023/05/30 12:37:57.629 11776 11893 Info ml_camera_client Camonly OnAvailable CamId = 1
2023/05/30 12:37:57.630 11776 11895 Info ml_camera_client MixedRealityCamera OnAvailable
2023/05/30 12:37:57.636 11776 11902 Info ml_camera_client Camonly OnUnavailable CamId = 0
2023/05/30 12:37:57.640 11776 11797 Info Unity Camera device connected
2023/05/30 12:37:57.640 11776 11797 Info Unity MagicLeap.Examples.TestCameraRecording:ConnectCamera()
2023/05/30 12:37:57.640 11776 11797 Info Unity MagicLeap.Examples.<EnableMLCamera>d__20:MoveNext()
2023/05/30 12:37:57.640 11776 11797 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/30 12:37:57.640 11776 11797 Info Unity MagicLeap.Examples.TestCameraRecording:TryEnableMLCamera()
2023/05/30 12:37:57.640 11776 11797 Info Unity UnityEngine.XR.MagicLeap.MLPermissions:RequestPermissionInternal(String, Callbacks)
2023/05/30 12:37:57.640 11776 11797 Info Unity UnityEngine.XR.MagicLeap.MLPermissions:RequestPermission(String, Callbacks)
2023/05/30 12:37:57.640 11776 11797 Info Unity MagicLeap.Examples.TestCameraRecording:Start()
2023/05/30 12:37:57.640 11776 11797 Info Unity 
2023/05/30 12:37:57.647 11776 11797 Info Unity Camera device received stream caps
2023/05/30 12:37:57.647 11776 11797 Info Unity MagicLeap.Examples.TestCameraRecording:ConnectCamera()
2023/05/30 12:37:57.647 11776 11797 Info Unity MagicLeap.Examples.<EnableMLCamera>d__20:MoveNext()
2023/05/30 12:37:57.647 11776 11797 Info Unity UnityEngine.SetupCoroutine:InvokeMoveNext(IEnumerator, IntPtr)
2023/05/30 12:37:57.647 11776 11797 Info Unity MagicLeap.Examples.TestCameraRecording:TryEnableMLCamera()
2023/05/30 12:37:57.647 11776 11797 Info Unity UnityEngine.XR.MagicLeap.MLPermissions:RequestPermissionInternal(String, Callbacks)
2023/05/30 12:37:57.647 11776 11797 Info Unity UnityEngine.XR.MagicLeap.MLPermissions:RequestPermission(String, Callbacks)
2023/05/30 12:37:57.647 11776 11797 Info Unity MagicLeap.Examples.TestCameraRecording:Start()
2023/05/30 12:37:57.647 11776 11797 Info Unity 
2023/05/30 12:37:57.647 11776 11797 Info Unity Camera device available
2023/05/30 12:37:57.648 11776 11797 Info Unity Camera device available
2023/05/30 12:37:57.648 11776 11797 Info Unity Camera device available
2023/05/30 12:37:57.648 11776 11776 Warn UnityGfxDeviceW type=1400 audit(0.0:1758): avc: denied { search } for name="traces" dev="nvme0n1p37" ino=7340035 scontext=u:r:untrusted_app:s0:c109,c256,c512,c768 tcontext=u:object_r:trace_data_file:s0 tclass=dir permissive=0
2023/05/30 12:37:57.679 11776 11903 Info EGL-MAIN found extension DRI_Core version 2
2023/05/30 12:37:57.679 11776 11903 Info EGL-MAIN found extension DRI_IMAGE_DRIVER version 1
2023/05/30 12:37:57.679 11776 11903 Info EGL-MAIN found extension DRI_ConfigOptions version 2


Here is the updated CameraRecording.cs script:

using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using MagicLeap.Core;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.MagicLeap;


namespace MagicLeap.Examples
{

    /// <summary>
    /// This class handles video recording and image capturing based on controller
    /// input.
    /// </summary>
    /// 


    public class CameraRecording : MonoBehaviour
    {
        private MLCamera.CaptureFrameRate FrameRate = MLCamera.CaptureFrameRate._60FPS;
        private MLCamera.OutputFormat OutputFormat = MLCamera.OutputFormat.RGBA_8888;
        private MLCamera captureCamera;
        private bool isCapturingVideo = false;

        [SerializeField, Tooltip("Button that starts the Capture")]
        private Button captureButton;
        private bool skipFrame = false;


        [SerializeField, Tooltip("Reference to media player behavior used in camera capture playback")]
        private MLMediaPlayerBehavior mediaPlayerBehavior;

        private readonly CameraRecorder cameraRecorder = new CameraRecorder();


        private const string validFileFormat = ".mp4";

        private bool isCapturingPreview = false;
        private bool RecordToFile = true;

        private string recordedFilePath;
        private MLCamera.CaptureType CaptureType = MLCamera.CaptureType.Video;


        private List<MLCamera.StreamCapability> streamCapabilities;

        private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

        private bool cameraDeviceAvailable;

        [SerializeField, Tooltip("Reference to the Raw Video Capture Visualizer GameObject for YUV frames")]
        private CameraCaptureVisualizer cameraCaptureVisualizer = null;


        private void Awake()
        {
            permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
            permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
            permissionCallbacks.OnPermissionDeniedAndDontAskAgain += OnPermissionDenied;

            //connectionFlagDropdown.AddOptions(
            //    MLCamera.ConnectFlag.CamOnly,
            //    MLCamera.ConnectFlag.MR,
            //    MLCamera.ConnectFlag.VirtualOnly);

            //captureButton.onClick.AddListener(OnCaptureButtonClicked);
            //connectButton.onClick.AddListener(ConnectCamera);
            //disconnectButton.onClick.AddListener(DisconnectCamera);
            //connectionFlagDropdown.onValueChanged.AddListener(v => RefreshUI());
            //streamCapabilitiesDropdown.onValueChanged.AddListener(v => RefreshUI());
            //qualityDropDown.onValueChanged.AddListener(v => RefreshUI());
            //captureTypeDropDown.onValueChanged.AddListener(v => RefreshUI());
            //frameRateDropDown.onValueChanged.AddListener(v => RefreshUI());

            //RefreshUI();
        }

        // Start is called before the first frame update

        private void Start()
        {
            Debug.Log("Start");
            MLPermissions.RequestPermission(MLPermission.Camera, permissionCallbacks);
            MLPermissions.RequestPermission(MLPermission.RecordAudio, permissionCallbacks);

            TryEnableMLCamera();
        }

        private void TryEnableMLCamera()
        {
            if (!MLPermissions.CheckPermission(MLPermission.Camera).IsOk)
                return;

            StartCoroutine(EnableMLCamera());
        }

        private IEnumerator EnableMLCamera()
        {
            while (!cameraDeviceAvailable)
            {
                MLResult result =
                    MLCamera.GetDeviceAvailabilityStatus(MLCamera.Identifier.Main, out cameraDeviceAvailable);
                if (!(result.IsOk && cameraDeviceAvailable))
                {
                    // Wait until camera device is available
                    yield return new WaitForSeconds(1.0f);
                }
                else
                {
                    ConnectCamera();

                    // Camera device is available, start video capture here
                }
            }

            Debug.Log("Camera device available");
        }

        // Update is called once per frame
        void Update()
        {
        }


        private void OnPermissionDenied(string permission)
        {
            if (permission == MLPermission.Camera)
            {
                MLPluginLog.Error($"{permission} denied, example won't function.");
            }
            else if (permission == MLPermission.RecordAudio)
            {
                MLPluginLog.Error($"{permission} denied, audio won't be recorded in the file.");
            }

            //RefreshUI();
        }



        private void OnPermissionGranted(string permission)
        {
            MLPluginLog.Debug($"Granted {permission}.");
            TryEnableMLCamera();

            //RefreshUI();
        }

        private void StartVideoCapture()
        {
            // recordedFilePath = string.Empty;
            // skipFrame = false;

            var result = MLPermissions.CheckPermission(MLPermission.Camera);
            MLResult.DidNativeCallSucceed(result.Result, nameof(MLPermissions.CheckPermission));
            Debug.Log($"MLPermissions.CheckPermission {result}");
            if (!result.IsOk)
            {
                Debug.LogError($"{MLPermission.Camera} permission denied. Video will not be recorded.");
                return;
            }

            if (!RecordToFile)
                StartRecording();
            else
                StartPreview();
        }

        private void StartRecording()
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.MediaPlayer.OnPrepared += MediaPlayerOnOnPrepared;
            mediaPlayerBehavior.MediaPlayer.OnCompletion += MediaPlayerOnCompletion;
#endif
            string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + validFileFormat;
            recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);

            CameraRecorderConfig config = CameraRecorderConfig.CreateDefault();
            config.Width = streamCapabilities[3].Width;
            config.Height = streamCapabilities[3].Height;
            config.FrameRate = MapFrameRate(MLCamera.CaptureFrameRate._60FPS);

            cameraRecorder.StartRecording(recordedFilePath, config);

            int MapFrameRate(MLCamera.CaptureFrameRate frameRate)
            {
                switch (frameRate)
                {
                    case MLCamera.CaptureFrameRate.None: return 0;
                    case MLCamera.CaptureFrameRate._15FPS: return 15;
                    case MLCamera.CaptureFrameRate._30FPS: return 30;
                    case MLCamera.CaptureFrameRate._60FPS: return 60;
                    default: return 0;
                }
            }

            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = FrameRate;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] = MLCamera.CaptureStreamConfig.Create(streamCapabilities[3], OutputFormat);
            captureConfig.StreamConfigs[0].Surface = cameraRecorder.MediaRecorder.InputSurface;

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            Debug.Log($"Check Camera is ready for capture {MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture))}");

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));

                    Debug.LogError($"isCapturingVideo {isCapturingVideo} ");

                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, RecordToFile);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    if (isCapturingPreview)
                    {
                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, RecordToFile);
                    }
                }
            }
        }
       
        private void StartPreview()
        {
            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._60FPS;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            captureConfig.StreamConfigs[0] =
                MLCamera.CaptureStreamConfig.Create(streamCapabilities[3], OutputFormat);

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                captureCamera.PreCaptureAEAWB();

                if (CaptureType == MLCamera.CaptureType.Video)
                {
                    result = captureCamera.CaptureVideoStart();
                    isCapturingVideo = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CaptureVideoStart));

                    Debug.LogError($"isCapturingVideo {isCapturingVideo} ");
                    if (isCapturingVideo)
                    {
                        cameraCaptureVisualizer.DisplayCapture(captureConfig.StreamConfigs[0].OutputFormat, true);
                    }
                }

                if (CaptureType == MLCamera.CaptureType.Preview)
                {
                    result = captureCamera.CapturePreviewStart();
                    isCapturingPreview = MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                    Debug.LogError($"isCapturingPreview {isCapturingPreview} ");
                    if (isCapturingPreview)
                    {

                        cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, true);
                    }
                }
            }
        }

        private void ConnectCamera()
        {
            MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
            context.Flags = MLCamera.ConnectFlag.CamOnly;
            context.EnableVideoStabilization = true;

            if (context.Flags != MLCamera.ConnectFlag.CamOnly)
            {
                context.MixedRealityConnectInfo = MLCamera.MRConnectInfo.Create();
                context.MixedRealityConnectInfo.MRQuality = MLCamera.MRQuality._960x720;
                context.MixedRealityConnectInfo.MRBlendType = MLCamera.MRBlendType.Additive;
                context.MixedRealityConnectInfo.FrameRate = MLCamera.CaptureFrameRate._60FPS;
            }

            captureCamera = MLCamera.CreateAndConnect(context);

            if (captureCamera != null)
            {
                Debug.Log("Camera device connected");
                if (GetImageStreamCapabilities())
                {
                    ShowToast("Camera device connected");
                    Debug.Log("Camera device received stream caps");
                    captureCamera.OnRawVideoFrameAvailable += OnCaptureRawVideoFrameAvailable;
                    captureCamera.OnRawImageAvailable += OnCaptureRawImageComplete;

                    StartVideoCapture();
                }
            }

        }

        private bool GetImageStreamCapabilities()
        {
            var result =
                captureCamera.GetStreamCapabilities(out MLCamera.StreamCapabilitiesInfo[] streamCapabilitiesInfo);

            if (!result.IsOk)
            {
                Debug.Log("Could not get Stream capabilities Info.");
                return false;
            }

            streamCapabilities = new List<MLCamera.StreamCapability>();

            for (int i = 0; i < streamCapabilitiesInfo.Length; i++)
            {
                foreach (var streamCap in streamCapabilitiesInfo[i].StreamCapabilities)
                {
                    streamCapabilities.Add(streamCap);
                }
            }

            return streamCapabilities.Count > 0;
        }

        private void MediaPlayerOnOnPrepared(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.Play();
#endif
        }

        private void MediaPlayerOnCompletion(MLMedia.Player mediaplayer)
        {
            // media player not supported in Magic Leap App Simulator
#if !UNITY_EDITOR
            mediaPlayerBehavior.StopMLMediaPlayer();
#endif
            mediaPlayerBehavior.gameObject.SetActive(false);
            mediaPlayerBehavior.Reset();
        }

        private void OnCaptureRawVideoFrameAvailable(MLCamera.CameraOutput capturedFrame, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {
            // if (string.IsNullOrEmpty(captureInfoText.text) && isCapturingVideo)
            // {
            // captureInfoText.text = capturedFrame.ToString();
            // }



            if (OutputFormat == MLCamera.OutputFormat.RGBA_8888 && FrameRate == MLCamera.CaptureFrameRate._30FPS && streamCapabilities[0].Width >= 4096)
            {
                // cameraCaptureVisualizer cannot handle throughput of RGBA_8888 4096x3072 at 30 fps 
                skipFrame = !skipFrame;
                if (skipFrame)
                {
                    return;
                }
            }
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedFrame);
        }

        /// <summary>
        /// Handles the event of a new image getting captured.
        /// </summary>
        /// <param name="capturedImage">Captured frame.</param>
        /// <param name="resultExtras">Results Extras.</param>
        private void OnCaptureRawImageComplete(MLCamera.CameraOutput capturedImage, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {

            // isDisplayingImage = true;
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedImage);

            if (RecordToFile)
            {
                if (capturedImage.Format != MLCamera.OutputFormat.YUV_420_888)
                {
                    string fileName = DateTime.Now.ToString("MM_dd_yyyy__HH_mm_ss") + ".jpg";
                    recordedFilePath = System.IO.Path.Combine(Application.persistentDataPath, fileName);
                    try
                    {
                        File.WriteAllBytes(recordedFilePath, capturedImage.Planes[0].Data);
                        // captureInfoText.text += $"\nSaved to {recordedFilePath}";
                    }
                    catch (Exception e)
                    {
                        Debug.LogError(e.Message);
                    }
                }
            }
        }
        public void ShowToast(string message)
        {
            if (Application.platform == RuntimePlatform.Android)
            {
                // Retrieve the UnityPlayer class
                AndroidJavaClass unityPlayerClass = new AndroidJavaClass("com.unity3d.player.UnityPlayer");

                // Retrieve the current activity from the UnityPlayer class
                AndroidJavaObject currentActivity = unityPlayerClass.GetStatic<AndroidJavaObject>("currentActivity");

                // Show the toast message
                currentActivity.Call("runOnUiThread", new AndroidJavaRunnable(() =>
                {
                    // Retrieve the Toast class
                    AndroidJavaClass toastClass = new AndroidJavaClass("android.widget.Toast");

                    // Create the Toast object (duration 0 == Toast.LENGTH_SHORT)
                    AndroidJavaObject toastObject = toastClass.CallStatic<AndroidJavaObject>("makeText", currentActivity, message, 0);

                    // Show the Toast
                    toastObject.Call("show");
                }));
            }
            else
            {
                Debug.Log("Toast message: " + message);
            }
        }


    }

}

Hi @usman.bashir

We appreciate you reaching out to us regarding this issue. I have reached out to our developers and I will report back to you once we learn more about why the camera's data is not being displayed on the Canvas Background.

I would also like to ask for a few more details:

  • Unity Editor version
  • ML2 OS Version
  • ML SDK Version
  • Development OS Platform

Thanks,

El

Thanks @etucker .

Here are the details:

  • Unity Editor version: 2022.2.0b16
  • ML2 OS version: Version 1.2.0, Build B3E.230330.11-R.044
  • MLSDK version: v1.6.1
  • Host OS: Windows 11

Kind regards,

Thank you for the info. We are working on this and I will report back as soon as we have a lead on this issue.

Hello @usman.bashir,

We believe that the NullReferenceException may be interfering with subsequent API calls, resulting in the InvalidParam errors. We suggest running our camera capture example and comparing the code to see what may be missing.

Thanks,

El

If you have visited the previous threads on this same question, you will see that I have implemented essentially the same code based on your examples. I am simply not using the drop-down menus or buttons from your example, but the code is almost the same.

We would also suggest updating to the latest MLSDK/Unity Editor/OS and checking to see if that resolves the error.

Thanks

El

@etucker

Unity editor installed for MagicLeapExamples is: 2022.2.0f1

Unity editor on which I am working on my project is: 2022.2.0b16

Do you want me to use the same editor for my project which is being used for MagicLeapExamples?

@usman.bashir

We recommend using the latest 2022.2.x version of the editor available (2022.2.20f1 or later)

And the SDK Examples should come from the latest version as well (1.7.0)

@usman.bashir

We have some new discoveries regarding your code. The primary issue that we found had to do with the format in which you did a preview capture.

The proper format to use for a preview capture is YUV (YUV_420_888).
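To make the fix concrete, the key change is in the stream configuration passed to PrepareCapture. Here is a minimal sketch of just that part; the stream-capability index shown is a placeholder assumption and should be validated against the capabilities the device actually reports:

```csharp
// Request YUV_420_888 output for the preview stream instead of RGBA_8888.
// streamCapabilities[0] is a placeholder index; pick an entry that supports
// the Preview capture type at your desired resolution.
MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
captureConfig.CaptureFrameRate = MLCamera.CaptureFrameRate._60FPS;
captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
captureConfig.StreamConfigs[0] = MLCamera.CaptureStreamConfig.Create(
    streamCapabilities[0],
    MLCamera.OutputFormat.YUV_420_888);
```

The full edited script below applies this change inside StartPreview.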

Here is your proposed code that we have edited and it should no longer throw these errors when creating a preview camera.

using System.Collections;
using System.Collections.Generic;
using MagicLeap.Core;
using UnityEngine;
using UnityEngine.XR.MagicLeap;


namespace MagicLeap.Examples
{

    /// <summary>
    /// This class handles video recording and image capturing based on controller
    /// input.
    /// </summary>
    public class CameraRecording : MonoBehaviour
    {
        private MLCamera.CaptureFrameRate FrameRate = MLCamera.CaptureFrameRate._60FPS;
        private MLCamera.OutputFormat OutputFormat = MLCamera.OutputFormat.YUV_420_888;
        private MLCamera captureCamera;
        private bool isCapturingVideo = false;

        private bool skipFrame = false;


        [SerializeField, Tooltip("Reference to media player behavior used in camera capture playback")]
        private MLMediaPlayerBehavior mediaPlayerBehavior;




        private bool isCapturingPreview;

        private MLCamera.CaptureType CaptureType = MLCamera.CaptureType.Preview;


        private List<MLCamera.StreamCapability> streamCapabilities;

        private readonly MLPermissions.Callbacks permissionCallbacks = new MLPermissions.Callbacks();

        private bool cameraDeviceAvailable;

        [SerializeField, Tooltip("Reference to the Raw Video Capture Visualizer gameobject for YUV frames")]
        private CameraCaptureVisualizer cameraCaptureVisualizer = null;


        private void Awake()
        {
            permissionCallbacks.OnPermissionGranted += OnPermissionGranted;
            permissionCallbacks.OnPermissionDenied += OnPermissionDenied;
            permissionCallbacks.OnPermissionDeniedAndDontAskAgain += OnPermissionDenied;
        }

        // Start is called before the first frame update

        private void Start()
        {
            MLPermissions.RequestPermission(MLPermission.Camera, permissionCallbacks);
            MLPermissions.RequestPermission(MLPermission.RecordAudio, permissionCallbacks);
            TryEnableMLCamera();
        }

        private void TryEnableMLCamera()
        {
            if (!MLPermissions.CheckPermission(MLPermission.Camera).IsOk)
                return;

            StartCoroutine(EnableMLCamera());
        }

        private IEnumerator EnableMLCamera()
        {
            while (!cameraDeviceAvailable)
            {
                MLResult result =
                    MLCamera.GetDeviceAvailabilityStatus(MLCamera.Identifier.Main, out cameraDeviceAvailable);
                if (!(result.IsOk && cameraDeviceAvailable))
                {
                    // Wait until camera device is available
                    yield return new WaitForSeconds(1.0f);
                }
            }

            yield return new WaitForEndOfFrame();
            Debug.Log("Camera device available");
            ConnectCamera();
        }

        private void OnPermissionDenied(string permission)
        {
            if (permission == MLPermission.Camera)
            {
                MLPluginLog.Error($"{permission} denied, example won't function.");
            }
            else if (permission == MLPermission.RecordAudio)
            {
                MLPluginLog.Error($"{permission} denied, audio won't be recorded in the file.");
            }
        }

        private void OnPermissionGranted(string permission)
        {
            MLPluginLog.Debug($"Granted {permission}.");
        }

        private void StartVideoCapture()
        {
            skipFrame = false;
            var result = MLPermissions.CheckPermission(MLPermission.Camera);
            MLResult.DidNativeCallSucceed(result.Result, nameof(MLPermissions.CheckPermission));
            MLPluginLog.Debug($"MLPermissions.CheckPermission {result}");
            if (!result.IsOk)
            {
                Debug.LogError($"{MLPermission.Camera} permission denied. Video will not be recorded.");
                return;
            }
            StartPreview();
        }
        
        private void StartPreview()
        {
            MLCamera.CaptureConfig captureConfig = new MLCamera.CaptureConfig();
            captureConfig.CaptureFrameRate = FrameRate;
            captureConfig.StreamConfigs = new MLCamera.CaptureStreamConfig[1];
            // NOTE: index 3 assumes that entry supports Preview at the desired resolution;
            // validate the index against the capabilities reported by the device.
            captureConfig.StreamConfigs[0] =
                MLCamera.CaptureStreamConfig.Create(streamCapabilities[3], MLCameraBase.OutputFormat.YUV_420_888);

            MLResult result = captureCamera.PrepareCapture(captureConfig, out MLCamera.Metadata _);

            if (MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.PrepareCapture)))
            {
                result = captureCamera.CapturePreviewStart();
                isCapturingPreview =
                    MLResult.DidNativeCallSucceed(result.Result, nameof(captureCamera.CapturePreviewStart));
                if (isCapturingPreview)
                {
                    cameraCaptureVisualizer.DisplayPreviewCapture(captureCamera.PreviewTexture, false);
                }
            }
        }

        private void ConnectCamera()
        {
            MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
            context.Flags = MLCamera.ConnectFlag.CamOnly;
            context.EnableVideoStabilization = true;

            captureCamera = MLCamera.CreateAndConnect(context);

            if (captureCamera != null)
            {
                print("Camera device connected");
                if (GetImageStreamCapabilities())
                {
                    print("Camera device received stream caps");
                    captureCamera.OnRawVideoFrameAvailable += OnCaptureRawVideoFrameAvailable;
                    captureCamera.OnRawImageAvailable += OnCaptureRawImageComplete;

                    StartVideoCapture();
                }
            }

        }
        private bool GetImageStreamCapabilities()
        {
            var result =
                captureCamera.GetStreamCapabilities(out MLCamera.StreamCapabilitiesInfo[] streamCapabilitiesInfo);

            if (!result.IsOk)
            {
                Debug.Log("Could not get Stream capabilities Info.");
                return false;
            }

            streamCapabilities = new List<MLCamera.StreamCapability>();

            for (int i = 0; i < streamCapabilitiesInfo.Length; i++)
            {
                foreach (var streamCap in streamCapabilitiesInfo[i].StreamCapabilities)
                {
                    streamCapabilities.Add(streamCap);
                }
            }

            return streamCapabilities.Count > 0;
        }
        private void OnCaptureRawVideoFrameAvailable(MLCamera.CameraOutput capturedFrame, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedFrame);
        }

        /// <summary>
        /// Handles the event of a new image getting captured.
        /// </summary>
        /// <param name="capturedImage">Captured frame.</param>
        /// <param name="resultExtras">Results Extras.</param>
        private void OnCaptureRawImageComplete(MLCamera.CameraOutput capturedImage, MLCamera.ResultExtras resultExtras, MLCamera.Metadata metadataHandle)
        {
            cameraCaptureVisualizer.OnCaptureDataReceived(resultExtras, capturedImage);
        }
    }

}

Thanks,

El

I am testing the code you provided.

Thank you so much @etucker .

Can we merge the following function or shift our communication to the following question?

This will help us direct our research-related queries to a single thread. Apologies for the inconvenience.

Kind regards,