Good afternoon everyone,
I am developing an application for Magic Leap 2 that tracks an object equipped with retroreflective markers and overlays its holographic counterpart on the real object in real-time.
- Unity Editor Version: 2022.3.61f1
- ML2 OS Version: 1.12.0
- MLSDK Version: 1.12.0
As you can see in the video, the tracking itself appears correct, and I am able to compute the pose of the object in the world reference frame. However, when I move my head, the hologram drifts: it briefly follows the head movement before realigning with the tracked object.
To compute the object’s pose in the world reference frame (i.e., the XROrigin, corresponding to the head’s initial position when the app launches), I perform the following matrix multiplication:
```csharp
Matrix4x4 worldTobject = worldTsensor * sensorTobject;
```
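For context, here is a minimal sketch of how I build the two matrices from Unity `Pose` values and apply the result to the hologram (`objectInSensorPose` and `hologram` are placeholder names, and unit scale is assumed):

```csharp
// Sketch: build homogeneous transforms from Unity Pose values (unit scale assumed).
Matrix4x4 worldTsensor = Matrix4x4.TRS(sensorPose.position, sensorPose.rotation, Vector3.one);
Matrix4x4 sensorTobject = Matrix4x4.TRS(objectInSensorPose.position, objectInSensorPose.rotation, Vector3.one);
Matrix4x4 worldTobject = worldTsensor * sensorTobject;

// Extract position and rotation to place the hologram.
Vector3 worldPos = worldTobject.GetColumn(3);
Quaternion worldRot = worldTobject.rotation;
hologram.transform.SetPositionAndRotation(worldPos, worldRot);
```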
where `worldTsensor` is calculated like this:

```csharp
Pose offset = new Pose(
    xrOrigin.CameraFloorOffsetObject.transform.position,
    xrOrigin.transform.rotation
);
Pose sensorPose = pixelSensorFeature.GetSensorPose(sensorId.Value, offset);
```
and `sensorTobject` is the pose of the tracked object relative to the sensor reference frame, solved with a PnP algorithm under a pinhole camera model.
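Since the PnP solver works in a right-handed camera frame (OpenCV-style: X right, Y down, Z forward) while Unity is left-handed, I also convert the solver output before using it as `sensorTobject`. The sketch below shows the convention I assume (a similarity transform that flips the Y axis); `r` and `t` are the rotation matrix and translation vector from the solver, and this convention may need adapting:

```csharp
// Assumed sketch: convert a PnP pose from an OpenCV-style right-handed camera
// frame into Unity's left-handed frame. 'r' is the 3x3 rotation (row-major)
// and 't' the translation returned by the solver.
Matrix4x4 m = Matrix4x4.identity;
for (int i = 0; i < 3; i++)
    for (int j = 0; j < 3; j++)
        m[i, j] = (float)r[i, j];
m[0, 3] = (float)t[0];
m[1, 3] = (float)t[1];
m[2, 3] = (float)t[2];

// Similarity transform with F = diag(1, -1, 1): M_unity = F * M * F,
// which flips the Y row/column of R and negates the Y translation.
Matrix4x4 f = Matrix4x4.Scale(new Vector3(1f, -1f, 1f));
Matrix4x4 sensorTobject = f * m * f;
```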
I don’t understand why the hologram drifts even though everything is expressed relative to the XROrigin, which should remain fixed.
Am I missing an additional transformation (e.g., the head pose)? Or are the sensor pose and the depth camera frame not synchronized? I query the sensor pose just before processing the depth frame, as suggested in the Pixel Sensor API examples:
```csharp
Pose sensorPose = pixelSensorFeature.GetSensorPose(sensorId.Value, offset);
if (pixelSensorFeature.GetSensorData(sensorId.Value, stream, out var frame, out var metaData,
    Allocator.Temp, shouldFlipTexture: true))
{
    // Process frames ...
    Debug.Log("Sensor Pose: " + sensorPose);
    streamVisualizer.ProcessFrame(frame, metaData, sensorPose);
}
```
Any help or advice would be greatly appreciated!