RGB framerate is ignored and 0 timestamps on 25% of RGB frames

Unity Editor version: 2022.3.42f1
ML2 OS version: 1.12
MLSDK version: 1.12

Error messages from logs: no errors, but we get invalid data and unexpected behaviour

We use OpenXR to capture depth and RGB frames.
We set the framerates to 5 FPS for depth and 30 FPS for RGB.
At the OS level we set the local power frequency to 50 Hz.

We see the following unexpected results:

  1. Earlier with this setup only 25/50 FPS was allowed for RGB, but now OpenXR only allows 30, 31, ..., 60 FPS, both at 50 Hz and at 60 Hz local power frequency.
  2. The method which sets the RGB FPS returns success, but from the frame numbers we can see that 50 frames are generated per second. So it seems that capturing runs at 50 Hz instead of the 30 FPS we set. (We request a frame every 1/10th of a second.)
  3. Around 25% of the RGB frames have a 0 timestamp value, so we cannot pair them with depth and have to drop them. (Usually the next frame we request has a valid timestamp.) A simplified sketch of our request loop follows this list.
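
For reference, our request loop looks roughly like the sketch below (simplified, not our exact code; TryGetRgbFrame is a placeholder for our wrapper around GetSensorData, not an SDK call): we poll every 100 ms and drop any frame whose capture timestamp is 0.

```csharp
using System.Collections;
using UnityEngine;

public class RgbFramePump : MonoBehaviour
{
    // Placeholder for our wrapper around MagicLeapPixelSensorFeature.GetSensorData():
    // returns false when no frame is available; timestampNs is the capture time in nanoseconds.
    private bool TryGetRgbFrame(out long timestampNs) { timestampNs = 0; return false; }

    private IEnumerator Start()
    {
        var interval = new WaitForSeconds(0.1f); // we request a frame every 1/10th of a second

        while (true)
        {
            if (TryGetRgbFrame(out long timestampNs))
            {
                if (timestampNs == 0)
                {
                    // ~25% of the RGB frames arrive like this; they cannot be paired
                    // with depth, so we drop them and wait for the next request.
                    Debug.LogWarning("RGB frame with 0 timestamp dropped");
                }
                else
                {
                    // Valid frame: hand it to the depth/RGB pairing logic.
                }
            }
            yield return interval;
        }
    }
}
```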

Are you obtaining the RGB frames using the pixel sensor or are you using the standard Android Camera APIs? You may find that using the Android Camera API directly will be more efficient to capture the RGB camera frames.

We are using
QueryPixelSensorCapability(),
ApplySensorConfig(),
and GetSensorData(..)
from MagicLeapPixelSensorFeature.
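
Roughly, the call sequence looks like the fragment below. This is only a sketch, not our exact code: the capability/config types, the extra GetSensorData parameters, and the BuildFrameRateConfig helper are placeholders or assumptions, so they should be checked against the MagicLeapPixelSensorFeature documentation for the installed SDK version.

```csharp
// Sketch of the configure-then-poll flow; identifiers marked as placeholders below are
// not SDK symbols, and the exact signatures should be verified against the installed SDK.
// 'feature' is the MagicLeapPixelSensorFeature instance, 'rgbSensorId' / 'rgbStreamIndex'
// identify the RGB camera stream, and 'frameRateCapability' is the frame-rate capability type.

// 1. Query the allowed frame-rate range for the RGB stream (this is where we now see
//    30..60 FPS offered instead of the earlier 25/50 FPS), then apply 30 FPS.
//    ApplySensorConfig reports success, yet ~50 frames per second are generated.
feature.QueryPixelSensorCapability(rgbSensorId, frameRateCapability, rgbStreamIndex, out var allowedRange);
feature.ApplySensorConfig(rgbSensorId, BuildFrameRateConfig(rgbStreamIndex, fps: 30)); // BuildFrameRateConfig is our helper

// 2. Poll for frames (every 100 ms in our case) and read the capture timestamp
//    from the returned frame; ~25% of the RGB frames come back with timestamp 0.
if (feature.GetSensorData(rgbSensorId, rgbStreamIndex, out var frame, out var metaData,
        Unity.Collections.Allocator.Temp))
{
    long captureTimeNs = frame.CaptureTime; // 0 on roughly a quarter of the frames
}
```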

For the RGB camera, I would recommend using the Android Camera APIs directly and obtaining the depth camera frames using the pixel sensor.

Do you know what framerate your application is running at?

We use short-range depth at 5 FPS and time our depth frame requests to reach 5 FPS,
and we try to sync the RGB frame requests with the depth frame capture time, which leads to 5 to 10 FPS on RGB.

Sorry, let me clarify: how fast does your application run? What is the framerate of your application? Do you see noticeable lag in the application? You can use the Developer Hud to identify this.

After we modified the timing logic for getting RGB frames, things stabilized.
Now we do not request the RGB frame at exactly the same time the depth frame is captured, but at the closest previous/next RGB frame capture time (calculated from the first frame's capture time and the framerate).
With this we technically get the same frame, just a bit earlier or later than the depth frame is received.
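
Concretely, the adjusted logic just snaps the depth timestamp to the nearest RGB capture slot derived from the first RGB frame's capture time and the configured framerate. A minimal sketch of that calculation (all names here are ours, nothing below is an SDK symbol):

```csharp
// Minimal sketch of the adjusted timing: instead of requesting the RGB frame at the
// depth capture time (the earlier approach), snap the request to the nearest RGB
// capture slot derived from the first RGB timestamp and the nominal frame period.
public static class RgbFrameTiming
{
    // Returns the RGB capture timestamp (nanoseconds) closest to the given depth
    // timestamp, given the first RGB capture time and the configured RGB framerate.
    public static long NearestRgbCaptureTimeNs(long depthTimestampNs,
                                               long firstRgbTimestampNs,
                                               double rgbFps)
    {
        double framePeriodNs = 1e9 / rgbFps;                        // ~33.3 ms at 30 FPS
        double elapsedNs = depthTimestampNs - firstRgbTimestampNs;  // time since the first RGB frame
        long nearestIndex = (long)System.Math.Round(elapsedNs / framePeriodNs);
        if (nearestIndex < 0) nearestIndex = 0;                     // depth arrived before the first RGB frame
        return firstRgbTimestampNs + (long)(nearestIndex * framePeriodNs);
    }
}
```

Math.Round picks the nearest slot, so the chosen request time can be either the previous or the next RGB capture time relative to the depth frame.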
