A couple of things you could try:
Option 1: Pixel Sensor API (Recommended)
Have you looked into using the OpenXR Pixel Sensor API? It's the preferred way to access sensor data these days and plays nicely with OpenXR.
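As a rough sketch of the flow in Unity, enumerating and creating a sensor through the Pixel Sensor OpenXR feature looks something like the following. The namespace and the `MagicLeapPixelSensorFeature`, `GetSupportedSensors`, and `CreatePixelSensor` names are assumptions based on the feature's documentation — verify them against your SDK version:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.OpenXR;
// NOTE: the namespace and type names below are assumptions; check the
// Magic Leap Unity SDK's Pixel Sensor API reference for the exact spelling.
using MagicLeap.OpenXR.Features.PixelSensors;

public class PixelSensorStarter : MonoBehaviour
{
    void Start()
    {
        // The Pixel Sensor feature must be enabled in the project's OpenXR settings.
        var feature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        if (feature == null || !feature.enabled)
        {
            Debug.LogWarning("Pixel Sensor feature is not enabled.");
            return;
        }

        // Enumerate the sensors the device exposes and create the first one.
        if (feature.GetSupportedSensors(out List<PixelSensorId> sensors) && sensors.Count > 0)
        {
            feature.CreatePixelSensor(sensors[0]);
            Debug.Log($"Created pixel sensor: {sensors[0]}");
        }
    }
}
```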
Option 2: Legacy MLSDK
If you're sticking with the older way, you may want to try using the legacy MLSDK workflow.
Testing if Unbounded Space is being set properly in Unity
To test if the unbounded space is being set properly in Unity, follow these steps:
- Create a new scene.
- Go to the Package Manager and import the XR Rig package from the Magic Leap SDK.
- In the Inspector, set the XR Origin's Tracking Origin Mode to "Device".
- Place a 3D cube in your Unity scene so that it is visible to the main camera.
- Run the application and check whether the virtual content shifts when you switch the tracking origin to unbounded.
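The switch in the last step can be scripted with Unity's standard XR input subsystem API, which lets you confirm at runtime whether the runtime actually accepted the unbounded mode (a minimal sketch; attach it to any GameObject in the test scene):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Requests the Unbounded tracking origin at startup and logs whether
// the XR input subsystem accepted the request.
public class UnboundedSpaceCheck : MonoBehaviour
{
    void Start()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
        {
            // Unbounded is only reported as supported when the runtime
            // exposes it (e.g. the Reference Space feature is enabled).
            bool supported =
                (subsystem.GetSupportedTrackingOriginModes() &
                 TrackingOriginModeFlags.Unbounded) != 0;
            bool applied = supported &&
                subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Unbounded);
            Debug.Log($"Unbounded supported: {supported}, applied: {applied}");
        }
    }
}
```

If `applied` logs as false, revisit the OpenXR settings mentioned in the note below the steps.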
Note: Both the Perception Snapshots setting and the Reference Space feature need to be enabled in your project's OpenXR settings.
Some background information based on the MLCamera:
MLCVCameraGetFramePose gives you the position of the camera in the world-origin coordinate system, which is not a concept in OpenXR. However, it happens to align with our implementation of XR_MSFT_unbounded_reference_space, so we take advantage of that to allow developers to express the position of the pose in OpenXR as they choose.
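In the legacy Magic Leap Unity SDK this call surfaces as `MLCVCamera.GetFramePose`, which takes a frame's `MLTime` timestamp and returns a transform. The exact signature below is an assumption and should be checked against your SDK version; `Matrix4x4.GetPosition()` also requires Unity 2021.2 or newer:

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;

public static class CameraPoseSample
{
    // Returns the RGB camera pose (in the world-origin space described
    // above) for the frame captured at `frameTimestamp`.
    public static bool TryGetCameraPose(
        MLTime frameTimestamp, out Vector3 position, out Quaternion rotation)
    {
        position = Vector3.zero;
        rotation = Quaternion.identity;

        Matrix4x4 cameraTransform;
        MLResult result = MLCVCamera.GetFramePose(frameTimestamp, out cameraTransform);
        if (!result.IsOk)
            return false;

        position = cameraTransform.GetPosition();
        rotation = cameraTransform.rotation;
        return true;
    }
}
```

Because that world-origin space lines up with the unbounded reference space, the returned pose can be used directly in a scene whose tracking origin is set to unbounded.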