World Coordinate for depth camera

Unity Editor version: 2022.3.21f1
ML2 OS version: 1.7.0
Unity SDK version: 2.2.0
Host OS: Windows

Hello,

I am trying to make a demo that renders real-time point clouds overlaid on the real objects using the depth camera of the ML2. @kbabilinski provided a very useful example script in a previous post: Processing the depth frames. I essentially added a renderer for the points in world coordinates, which were transformed from camera coordinates using the cameraToWorldMatrix. However, it looks like I'm still missing a coordinate transformation, because the points do not overlay the corresponding objects as seen through the lenses. Could anyone help me with this? I attached a screenshot of my program: the point cloud is reconstructed correctly but not displayed in the right place.
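For reference, the per-point transform described above can be sketched as follows (a minimal sketch; `depthPoints` and `cameraToWorldMatrix` are assumed to come from the depth-frame processing script, and the class name is a placeholder):

```csharp
using UnityEngine;

public static class DepthPointTransform
{
    // Transforms depth-camera-space points into world space using the
    // cameraToWorldMatrix reported alongside the depth frame.
    public static Vector3[] ToWorld(Vector3[] depthPoints, Matrix4x4 cameraToWorldMatrix)
    {
        var worldPoints = new Vector3[depthPoints.Length];
        for (int i = 0; i < depthPoints.Length; i++)
        {
            // MultiplyPoint3x4 applies rotation + translation (no projection),
            // which is what a rigid camera-to-world transform needs.
            worldPoints[i] = cameraToWorldMatrix.MultiplyPoint3x4(depthPoints[i]);
        }
        return worldPoints;
    }
}
```

Note that this only produces correct world positions if the matrix and the application share the same tracking origin, which is the issue discussed below.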

Thank you

Happy to hear that the post helped you. :grin:

I assume you are using OpenXR in your project. If so, the issue you are running into is that the origin for the sensor is not the same as the origin for your application. To make sure the two align, you will need to set the tracking origin to "unbounded".

This setting is not visible when viewing the XR Origin in the Inspector, so you will need to use a script. It also requires the Magic Leap Reference Spaces feature to be enabled in your OpenXR feature settings. See the example below.


using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Management;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.MagicLeapSupport;

public class ReferenceSpaceToggle : MonoBehaviour
{
    private bool inputSubsystemValid;
    private XRInputSubsystem inputSubsystem;

    // Start is called before the first frame update
    // Start is called before the first frame update
    IEnumerator Start()
    {
        var referenceSpaceFeature = OpenXRSettings.Instance.GetFeature<MagicLeapReferenceSpacesFeature>();
        if (!referenceSpaceFeature.enabled)
        {
            Debug.LogError("Unbounded Tracking Space cannot be set if the OpenXR Magic Leap Reference Spaces Feature is not enabled. Stopping Script.");
            yield break;
        }

        // Wait until the XR loader and its input subsystem are initialized.
        yield return new WaitUntil(() => XRGeneralSettings.Instance != null &&
                                         XRGeneralSettings.Instance.Manager != null &&
                                         XRGeneralSettings.Instance.Manager.activeLoader != null &&
                                         XRGeneralSettings.Instance.Manager.activeLoader.GetLoadedSubsystem<XRInputSubsystem>() != null);

        inputSubsystem = XRGeneralSettings.Instance.Manager.activeLoader.GetLoadedSubsystem<XRInputSubsystem>();
        TrackingOriginModeFlags supportedModes = inputSubsystem.GetSupportedTrackingOriginModes();

        string supportedSpaces = string.Join("\n",
            ((TrackingOriginModeFlags[])Enum.GetValues(typeof(TrackingOriginModeFlags))).Where((flag) =>
                supportedModes.HasFlag(flag) && flag != TrackingOriginModeFlags.Unknown));
        Debug.Log($"Supported Spaces: {supportedSpaces}");

        string currentSpace = inputSubsystem.GetTrackingOriginMode().ToString();
        Debug.Log($"Current Space: {currentSpace}");

        inputSubsystemValid = true;

        SetSpace(TrackingOriginModeFlags.Unbounded);
    }

    public void SetSpace(TrackingOriginModeFlags flag)
    {
        if (inputSubsystemValid)
        {
            if (inputSubsystem.TrySetTrackingOriginMode(flag))
            {
                string currentSpace = inputSubsystem.GetTrackingOriginMode().ToString();
                Debug.Log($"Current Space: {currentSpace}");
                inputSubsystem.TryRecenter();
                return;
            }
        }
        Debug.LogError("SetSpace failed to set Tracking Origin Mode to " + flag.ToString());
    }
}

Alternatively, you can use the new (experimental) OpenXR Pixel Sensor API in Unity. Note that we are still refining the API and documentation; however, the API can be used without changing origin modes.

Thank you! It's working now :slight_smile:

Actually, I have one more question. Although the OpenXR and sensor origins look aligned now, I found an offset between the two lenses. In other words, the points seen through the left lens appear about an inch away from where they appear through the right lens. Could that be due to the tracking origin change I just made?

Does this mean that the points appear correctly in the right lens?
Are you using a custom shader for the point cloud? Make sure that the shader supports stereoscopic rendering.
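If the shader is custom, a common cause of a per-eye offset is a vertex stage that is missing Unity's stereo rendering macros, so both eyes render from the same eye matrix. A minimal vertex-stage sketch for single-pass instanced rendering (macro names are from Unity's built-in pipeline; the struct names here are placeholders):

```hlsl
// Minimal sketch of a stereo-aware vertex stage (built-in render pipeline).
struct appdata
{
    float4 vertex : POSITION;
    UNITY_VERTEX_INPUT_INSTANCE_ID   // carries the eye/instance index
};

struct v2f
{
    float4 pos : SV_POSITION;
    UNITY_VERTEX_OUTPUT_STEREO       // carries the stereo eye index to later stages
};

v2f vert(appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v);                 // select the correct per-eye matrices
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);   // write the eye index into the output
    o.pos = UnityObjectToClipPos(v.vertex);
    return o;
}
```

Without these macros, the shader compiles and looks correct in one eye but is offset in the other, which matches the symptom described above.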
