Locating the ML2 physical camera in OpenXR

@kbabilinski,

It looks like the exception thrown in GetClosestTimeStampData() that I reported in my previous post has already been seen before [1].

Based on your comments in that thread: my app runs the camera at 30 FPS, not 60, so too-old timestamps are probably not the cause. However, having to use both the Perception system and the OpenXR runtime to locate the camera, by multiplying the pose returned by xrLocateSpace by the one returned by MLCVCameraGetFramePose as you detailed previously, amounts to significant resource usage. I wish I could use OpenXR on its own (my initial objective), but it seems that interop with the ML SDK is needed for what I'm trying to achieve.
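For reference, here is a minimal sketch of how I understand that two-step lookup, assuming that `perceptionOriginSpace` is an XrSpace representing the perception world origin that MLCVCameraGetFramePose reports against, and that composing the two poses yields the camera in my base space. The space names are placeholders and handle setup / error handling is trimmed, so please correct me if this is not what you meant:

```cpp
#include <openxr/openxr.h>
#include <ml_cv_camera.h>
#include <ml_time.h>

// Rotate vector v by quaternion q: v' = v + w*t + u x t, with u = (x,y,z), t = 2*(u x v).
static XrVector3f Rotate(const XrQuaternionf &q, const XrVector3f &v) {
  XrVector3f t{2.0f * (q.y * v.z - q.z * v.y),
               2.0f * (q.z * v.x - q.x * v.z),
               2.0f * (q.x * v.y - q.y * v.x)};
  return {v.x + q.w * t.x + (q.y * t.z - q.z * t.y),
          v.y + q.w * t.y + (q.z * t.x - q.x * t.z),
          v.z + q.w * t.z + (q.x * t.y - q.y * t.x)};
}

// Compose parent * child: apply the child transform, then the parent.
static XrPosef Compose(const XrPosef &p, const MLTransform &c) {
  const XrQuaternionf &a = p.orientation;
  const MLQuaternionf &b = c.rotation;
  XrPosef out;
  out.orientation = {a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
                     a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
                     a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
                     a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z};
  XrVector3f childPos{c.position.x, c.position.y, c.position.z};
  XrVector3f rotated = Rotate(a, childPos);
  out.position = {p.position.x + rotated.x,
                  p.position.y + rotated.y,
                  p.position.z + rotated.z};
  return out;
}

// Locate the RGB camera in baseSpace for a given frame timestamp.
bool LocateCamera(XrSpace baseSpace, XrSpace perceptionOriginSpace,
                  XrTime displayTime, MLHandle cvCameraHandle,
                  MLHandle headHandle, MLTime frameTimestamp,
                  XrPosef *outCameraInBase) {
  // 1) Perception origin relative to my base space (OpenXR side).
  XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
  if (XR_FAILED(xrLocateSpace(perceptionOriginSpace, baseSpace, displayTime,
                              &location)) ||
      !(location.locationFlags & XR_SPACE_LOCATION_POSITION_VALID_BIT)) {
    return false;
  }
  // 2) Camera relative to the perception origin (ML SDK side).
  MLTransform cameraInPerception{};
  if (MLCVCameraGetFramePose(cvCameraHandle, headHandle,
                             MLCVCameraID_ColorCamera, frameTimestamp,
                             &cameraInPerception) != MLResult_Ok) {
    return false;
  }
  // 3) Multiply the two poses as described above.
  *outCameraInBase = Compose(location.pose, cameraInPerception);
  return true;
}
```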

So for now, the only way I've found to accurately locate the camera without triggering the exception in GetClosestTimeStampData() is to record the system monotonic clock in the OnVideoAvailable callback (rather than using the camera timestamp in MLCameraResultExtras::vcam_timestamp), convert it to MLTime with MLTimeConvertSystemTimeToMLTime, and pass it to MLCVCameraGetFramePose to get the camera pose, without relying on OpenXR's xrLocateSpace at all, which is suboptimal. This kind of works: the returned camera locations are accurate, but the location updates are really laggy. I know the lag may come from the erroneous (i.e. too new) timestamps recorded in the OnVideoAvailable callback rather than at exposure time in MLCameraResultExtras::vcam_timestamp as expected for display prediction, but it's the only solution I've found so far. And I'm pretty sure this is not the right solution...
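Concretely, the workaround looks roughly like the helper below, which I call from inside my video-available callback. The function name is mine; the point is that the clock is read at frame-delivery time rather than at exposure time, which is exactly the part I suspect is wrong, so treat this as an illustration of the hack, not a recommendation:

```cpp
#include <ctime>
#include <ml_cv_camera.h>
#include <ml_time.h>

bool GetCameraPoseAtCallbackTime(MLHandle cvCameraHandle, MLHandle headHandle,
                                 MLTransform *outCameraPose) {
  // 1) Timestamp taken inside the video-available callback, i.e. when the
  //    frame is delivered, which is later than the actual exposure time.
  struct timespec now{};
  clock_gettime(CLOCK_MONOTONIC, &now);

  // 2) Convert the monotonic timestamp to MLTime.
  MLTime mlTime = 0;
  if (MLTimeConvertSystemTimeToMLTime(&now, &mlTime) != MLResult_Ok) {
    return false;
  }

  // 3) Query the RGB camera pose for that (too-new) timestamp. This avoids
  //    the GetClosestTimeStampData() exception but lags behind the real pose.
  return MLCVCameraGetFramePose(cvCameraHandle, headHandle,
                                MLCVCameraID_ColorCamera, mlTime,
                                outCameraPose) == MLResult_Ok;
}
```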

[1] Crash on GetClosestTimeStampData()