RGB and Depth frame pose difference increasing with the distance from origin?

Give us as much detail as possible regarding the issue you’re experiencing:

Unity Editor version: 2022.3.42f1
ML2 OS version: 1.12.0
Unity SDK version: 2.6
Host OS: Windows 11

Error messages from logs (syntax-highlighting is supported via Markdown): -

We use the suggested SDK modification from the
Hologram Drift Issue When Tracking Object with Retroreflective Markers using Depth Camera (Raw data) - OpenXR - Magic Leap 2 Developer Forums
topic to get the capture timestamps for the RGB and depth frames.
With this we can pair RGB and depth frames within a 10 ms capture-time difference, so we have the closest pairs.
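The pairing is essentially a nearest-timestamp match. A minimal sketch of what we do (FrameStamp is a hypothetical placeholder type for a frame plus its capture time, assumed here to be in nanoseconds):

```csharp
using System;
using System.Collections.Generic;

public struct FrameStamp
{
    public long CaptureTime; // capture timestamp in nanoseconds (assumed)
    public int FrameIndex;   // placeholder for the actual frame payload
}

public static class FramePairing
{
    const long MaxDeltaNs = 10_000_000; // 10 ms expressed in nanoseconds

    // For each RGB frame, find the depth frame with the nearest capture time
    // and keep the pair only if the gap is below 10 ms.
    public static List<(FrameStamp rgb, FrameStamp depth)> PairFrames(
        List<FrameStamp> rgbFrames, List<FrameStamp> depthFrames)
    {
        var pairs = new List<(FrameStamp rgb, FrameStamp depth)>();
        foreach (var rgb in rgbFrames)
        {
            FrameStamp? best = null;
            long bestDelta = long.MaxValue;
            foreach (var depth in depthFrames)
            {
                long delta = Math.Abs(depth.CaptureTime - rgb.CaptureTime);
                if (delta < bestDelta) { bestDelta = delta; best = depth; }
            }
            if (best.HasValue && bestDelta < MaxDeltaNs)
                pairs.Add((rgb, best.Value));
        }
        return pairs;
    }
}
```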

But if we check the poses obtained for the RGB and depth frames, we see larger and larger differences as we move further from the origin/starting point.
On the next charts you can see us moving around a box two times: first continuously, then stopping for a while several times.
The first chart shows the differences; the second one shows the absolute position values in mm.
The first chart also displays the capture-time difference in ms.
We checked it on multiple devices; it shows almost the same difference on different devices too.
What could cause this? How could we avoid it?

Additional info: we scanned a local space before the tests.

Thank you for the detailed graph.

Are you using the MLCamera API to obtain the camera images and the camera pose?

Also, to make sure that I understand: the main issue you are seeing is that the origin pose for the two frames is inconsistent as you move around the environment?

We use
`MagicLeap.OpenXR.Features.PixelSensors.MagicLeapPixelSensorFeature.GetSensorData(PixelSensorId sensorType, uint streamIndex, out PixelSensorFrame frame, out PixelSensorMetaData metaData, Allocator allocator, long timeOut = 10, bool shouldFlipTexture = true)`.
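A minimal sketch of how we poll a frame and read its capture time (assuming the sensor is already created, configured, and started, and that GetSensorData returns true on success; PollCaptureTime is our own hypothetical wrapper):

```csharp
using Unity.Collections;
using UnityEngine.XR.OpenXR;
using MagicLeap.OpenXR.Features.PixelSensors;

public static class SensorPolling
{
    // Polls one frame from an already-started stream and returns its capture
    // timestamp, or null if no frame was available.
    public static long? PollCaptureTime(PixelSensorId sensorId, uint streamIndex)
    {
        var feature = OpenXRSettings.Instance.GetFeature<MagicLeapPixelSensorFeature>();
        if (feature != null && feature.GetSensorData(
                sensorId, streamIndex,
                out PixelSensorFrame frame,
                out PixelSensorMetaData metaData,
                Allocator.Temp))
        {
            return frame.CaptureTime; // later used to pair frames and query poses
        }
        return null;
    }
}
```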

For getting the pose, we use the custom GetSensorPose you wrote in the
Hologram Drift Issue When Tracking Object with Retroreflective Markers using Depth Camera (Raw data) - OpenXR - Magic Leap 2 Developer Forums
topic.

And what we see is that if we get RGB and depth frames that have only around a 0.05 s capture-time difference, and we capture frames even with a “still” headset, the pose (x, y, z) coordinates are much higher if we capture frames 1 meter away from the origin than when we capture frames closer to the origin.

On the second graph you can see the distance from the origin;
on the first you can see the error between the RGB and depth frame poses.
The correlation between the distance from origin and the error between the RGB and depth poses is easy to see.

On the first graph I added remarks to each section (at the top of the graph) to describe the action we took while the frames were recorded.

I’m not sure I understand the question fully.

GetSensorPose returns the 3D pose of the sensor relative to the origin of the application. The pose is not relative to the headset itself but rather to a tracked point in the scene. The values appear to grow for both poses because you are moving further away from the origin (0, 0, 0).

The origin (0,0,0) is set when the device is turned on.
We capture RGB and Depth frames too.

We expect that if we capture RGB and depth frames at the same moment, the pose we get for the RGB frame should be very close to the pose we get for the depth frame.

Based on our measurements, it seems that the difference (error) between the RGB pose and the closest depth frame pose correlates with the distance from the origin.

So if we are close to the origin, the difference is small,
but if we move 1 meter away from the origin, the error is 40-50 mm in at least one direction (x, y, or z). (This comparison does not include the orientation differences, but those are important too.)
It is almost the same whether the device is still or moving at the moment of capture.

When we go back near the origin, the error is small again.

When getting the pose we pass the sensorId and the captureTime:
`pose = pixelSensorFeature.GetSensorPose(sensorId, frame.CaptureTime)`
The method we call is modified, based on your sample code in the topic above:

`public Pose GetSensorPose(PixelSensorId sensorType, long captureTime, Pose offset = default)`

To make the problem easier to see: from the 3 coordinate-difference values I calculated the distance between the RGB and depth sensors, which should be fixed.
But you can see that it changes too much as we move around the space.
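For illustration, the comparison we run per frame pair is roughly this (a minimal sketch; PoseComparison is a hypothetical helper, and GetSensorPose is the modified overload taking captureTime shown above):

```csharp
using UnityEngine;
using MagicLeap.OpenXR.Features.PixelSensors;

public static class PoseComparison
{
    // Queries both sensor poses at their capture times and logs the per-axis
    // difference plus its magnitude. The offset between the RGB camera and the
    // depth sensor is a rigid mechanical baseline, so the magnitude should stay
    // roughly constant no matter where we are relative to the origin.
    public static void ComparePair(MagicLeapPixelSensorFeature feature,
                                   PixelSensorId rgbId, long rgbCaptureTime,
                                   PixelSensorId depthId, long depthCaptureTime)
    {
        Pose rgbPose = feature.GetSensorPose(rgbId, rgbCaptureTime);       // modified overload
        Pose depthPose = feature.GetSensorPose(depthId, depthCaptureTime); // taking captureTime

        Vector3 error = rgbPose.position - depthPose.position; // per-axis difference in meters
        Debug.Log($"dx={error.x * 1000f:F1} dy={error.y * 1000f:F1} dz={error.z * 1000f:F1} mm, " +
                  $"baseline={error.magnitude * 1000f:F1} mm");
    }
}
```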

Interesting. Are you setting the tracked space to unbounded before performing the calculation?
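For reference, here is a minimal sketch of switching the tracking origin to the unbounded space through Unity's XRInputSubsystem (assuming unbounded reference-space support is enabled in the project's OpenXR settings):

```csharp
using System.Collections.Generic;
using UnityEngine.XR;

public static class UnboundedSpace
{
    // Attempts to switch the tracking origin mode to Unbounded on the first
    // input subsystem that supports it. Returns true on success.
    public static bool TrySetUnbounded()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
        {
            if ((subsystem.GetSupportedTrackingOriginModes() & TrackingOriginModeFlags.Unbounded) != 0)
                return subsystem.TrySetTrackingOriginMode(TrackingOriginModeFlags.Unbounded);
        }
        return false;
    }
}
```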

It seems I finally got to the solution. Now the measured differences are precise to within a few mm when changing position or orientation.

Thank you.

