Setting up the MLDepthcamera on Unity SDK 2.0.0?

Give us as much detail as possible regarding the issue you're experiencing:

Unity Editor version: 2022.3.17f1
ML2 OS version: 1.5.0
Unity SDK version: 2.0.0
Host OS: Windows

Hi, I can't find any examples of how to enable the MLDepthCamera with the new Unity SDK. The documentation is quite limited on how to configure the settings. How do I set the CaptureFlags, the FrameRate, and the StreamConfig[] list? Besides this, where do I find the appropriate timeout setting for the GetLatestDepthData function? Or do I just wing it? I currently have `depthResult = MLDepthCamera.GetLatestDepthData(5000, out MLDepthCamera.Data DepthData);`

Thanks in advance!

I recommend downloading the 1.12.0 Unity SDK examples. This example project demonstrates how to access the depth camera. Note that if you want to get the depth point from the image returned by the depth camera, you will need to combine the distance image with the depth flags to determine each depth pixel's validity.
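As a rough sketch of configuring the stream, adapted from the pattern in the SDK depth camera example (exact property names may differ slightly between SDK versions, so check against the example project):

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap;

public class DepthCameraSetup : MonoBehaviour
{
    void Start()
    {
        // Request the depth image plus the depth flags, so pixel validity
        // can be checked when processing frames later.
        var flags = MLDepthCamera.CaptureFlags.DepthImage |
                    MLDepthCamera.CaptureFlags.DepthFlags;

        // One StreamConfig entry per stream, indexed by the Stream enum.
        var config = new MLDepthCamera.StreamConfig[2];

        int i = (int)MLDepthCamera.Stream.LongRange;
        config[i].Flags = (uint)flags;
        config[i].FrameRateConfig = MLDepthCamera.FrameRate.FPS_5;

        i = (int)MLDepthCamera.Stream.ShortRange;
        config[i].Flags = (uint)flags;
        config[i].FrameRateConfig = MLDepthCamera.FrameRate.FPS_5;

        var settings = new MLDepthCamera.Settings
        {
            Streams = MLDepthCamera.Stream.LongRange,
            StreamConfig = config
        };

        MLDepthCamera.SetSettings(settings);
        var result = MLDepthCamera.Connect();
        if (!result.IsOk)
        {
            Debug.LogError($"Failed to connect to depth camera: {result}");
        }
    }
}
```

After connecting, you can poll `MLDepthCamera.GetLatestDepthData(timeoutMs, out data)` as in your snippet; the timeout is simply how long the call waits for a new frame before giving up.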

You will also have to undistort the image (pinhole camera model) using the intrinsic values, and then convert each distance (range) pixel into a 3D point. Something like:

float fx = intrinsics.focal_point_x;
float fy = intrinsics.focal_point_y;
float cx = intrinsics.principal_point_x;
float cy = intrinsics.principal_point_y;

point = normalize([(x - cx) / fx, (y - cy) / fy, 1]) * distance
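In Unity terms, that conversion might look like the following sketch. The z = 1 component gives the ray through the pixel on the normalized image plane; the intrinsics field names here are illustrative:

```csharp
using UnityEngine;

public static class DepthUnprojection
{
    // Convert a range sample (distance along the ray) at pixel (x, y)
    // into a 3D point in the depth camera's frame, given pinhole intrinsics.
    public static Vector3 RangeToPoint(float x, float y, float range,
                                       float fx, float fy, float cx, float cy)
    {
        // Ray through the pixel on the normalized image plane (z = 1).
        var ray = new Vector3((x - cx) / fx, (y - cy) / fy, 1f);

        // The depth camera reports distance along the ray, not z-depth,
        // so scale the unit ray by the measured range.
        return ray.normalized * range;
    }
}
```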


Thank you, I got it to work! Now I have another question. I'd like to stream or download the depth image to a desktop with Magic Leap WebRTC. The examples only show how to do that with the MLCamera. This does not seem to be applicable to the depth sensor, since the connectContext property does not exist for the MLDepthCamera, and neither does the CreateAndConnectAsync() function. Do you have any advice on how to set up a stream for the MLDepthCamera? Thanks!

I recommend using the Unity WebRTC plugin, as the Magic Leap WebRTC APIs are deprecated and will be removed in a future version of the SDK. The Unity WebRTC plugin provides a more flexible API that allows you to stream custom images more easily. That said, the Unity Community or the Unity WebRTC GitHub page might be a better place to get details on how to stream a custom texture.
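A rough sketch of that approach, assuming the `com.unity.webrtc` package is installed (the texture size and format here are assumptions; you would copy each depth frame into the texture yourself, and signaling is handled by your own transport):

```csharp
using Unity.WebRTC;
using UnityEngine;

public class DepthStreamer : MonoBehaviour
{
    private RenderTexture depthTexture;
    private VideoStreamTrack videoTrack;
    private RTCPeerConnection peerConnection;

    void Start()
    {
        // Texture you fill with the latest depth frame each update,
        // e.g. by writing the frame bytes into a Texture2D and blitting.
        depthTexture = new RenderTexture(544, 480, 0, RenderTextureFormat.BGRA32);
        depthTexture.Create();

        // Wrap the texture in a WebRTC video track and attach it to a
        // peer connection; the remote side receives it as a video stream.
        videoTrack = new VideoStreamTrack(depthTexture);
        peerConnection = new RTCPeerConnection();
        peerConnection.AddTrack(videoTrack);
        // Offer/answer exchange (signaling) is up to your application.
    }
}
```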