@etucker :
For the second point:
I tried to use the ConvertMLTimeToSystemTime method, but it seems to me that it does not return a common "system" time. Here are results from a recording:
TimeStampSys   ElapsedTSSeconds  FrameType         FrameNumber
191901473814   191.901473814     DepthShortRange   169
400564877422   200.282438711     WorldNormalRight  468
191865591241   191.865591241     CvVideo           205
400564877950   200.282438975     WorldNormalLeft   468
400564877810   200.282438905     WorldNormalCente  468
191941472628   191.941472628     DepthShortRange   170
191932267410   191.93226741      CvVideo           206
191981473928   191.981473928     DepthShortRange   171
191998928693   191.998928693     CvVideo           207
192021473486   192.021473486     DepthShortRange   172
400684877816   200.342438908     WorldNormalCente  471
400684877891   200.3424389455    WorldNormalRight  471
400684877825   200.3424389125    WorldNormalLeft   471
192061472488   192.061472488     DepthShortRange   173
192065604554   192.065604554     CvVideo           208
192101473610   192.10147361      DepthShortRange   174
400764876805   200.3824384025    WorldNormalLeft   473
400764879484   200.382439742     WorldNormalRight  473
400764876841   200.3824384205    WorldNormalCente  473
192141473692   192.141473692     DepthShortRange   175
192132263060   192.13226306      CvVideo           209
192181472653   192.181472653     DepthShortRange   176
192198930657   192.198930657     CvVideo           210
400884877325   200.4424386625    WorldNormalLeft   476
The TimeStampSys values are calculated by calling ConvertMLTimeToSystemTime. Investigating an hour-long recording, from the timestamp differences between frames of the same type we see that the world-camera system-time tick is half a nanosecond, while the system-time tick of the other cameras is 1 nanosecond.
Using this rule, I calculated the second column, ElapsedTSSeconds.
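The normalization described above can be sketched like this; the tick durations are our own empirical guess from the recording, not documented constants, and the frame-type labels are taken verbatim from the log:

```python
# Convert raw TimeStampSys values to elapsed seconds, assuming the
# per-sensor tick durations we inferred empirically from the recording:
# world cameras tick at 0.5 ns, the other cameras at 1 ns (not documented).
TICK_NS = {
    "WorldNormalLeft": 0.5,
    "WorldNormalRight": 0.5,
    "WorldNormalCente": 0.5,  # label as it appears (truncated) in the log
    "DepthShortRange": 1.0,
    "CvVideo": 1.0,
}

def elapsed_seconds(frame_type: str, timestamp_sys: int) -> float:
    """Scale a raw TimeStampSys value into seconds for its sensor."""
    return timestamp_sys * TICK_NS[frame_type] / 1e9
```

For example, `elapsed_seconds("WorldNormalRight", 400564877422)` gives the 200.282438711 value from the table above.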
From this we can see that there are 8-9 second differences between timestamps of different frame types that follow each other.
In the scene we recorded there was a stopwatch; with it we verified that the recording does not contain time shifts, so the differences come from the timestamps only.
Based on this we guess that the starting point (0 ns) can be different for each sensor.
So the question is:
After having the TimeStampSys values for each frame, how can we get, for one frame, the frames of all the other types that are closest to it in real time?
Can we calculate the real-time differences between timestamps if the frame types are different?
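To make the question concrete: if a constant per-sensor clock offset could be estimated once (e.g. from the stopwatch in the scene), the matching itself would just be a nearest-timestamp search over offset-corrected times. A minimal sketch under that assumption; the example timestamps are corrected values made up for illustration:

```python
import bisect

def nearest_frame(target_s: float, frames: list[tuple[float, int]]) -> tuple[float, int]:
    """frames: (corrected_timestamp_seconds, frame_number) pairs sorted by
    timestamp. Returns the entry closest in time to target_s."""
    times = [t for t, _ in frames]
    i = bisect.bisect_left(times, target_s)
    # The nearest frame is either the one just before or just after target_s.
    candidates = []
    if i > 0:
        candidates.append(frames[i - 1])
    if i < len(frames):
        candidates.append(frames[i])
    return min(candidates, key=lambda f: abs(f[0] - target_s))

# Hypothetical: match a DepthShortRange frame at 192.021 s against
# CvVideo frames whose timestamps were already offset-corrected.
cv_frames = [(191.999, 207), (192.066, 208), (192.132, 209)]
print(nearest_frame(192.021, cv_frames))  # → (191.999, 207)
```

The open part of the question is exactly how to obtain those per-sensor offsets reliably; the search itself is the easy half.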
For your 1st question:
E.g. in the case of the CV camera it is written that poses are cached and the frame timestamp is used for getting the pose.
But how?
The timestamp could be the key in the cache, or it could be used to obtain a global time that either selects the nearest-in-time pose or computes a pose from the previous and next poses in time, using a velocity vector, acceleration, etc.
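To illustrate the second hypothesis (computing a pose from the neighbouring cached poses), a minimal sketch that is linear in position only; a real pose interpolation would also slerp the rotation, and none of this claims to reflect the actual SDK internals:

```python
def interpolate_position(t: float,
                         t0: float, p0: tuple[float, float, float],
                         t1: float, p1: tuple[float, float, float]) -> tuple[float, float, float]:
    """Linearly interpolate a 3D position between two timestamped samples.
    t is expected to lie between t0 and t1 (t0 < t1)."""
    a = (t - t0) / (t1 - t0)  # interpolation weight in [0, 1]
    return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1))

# Hypothetical cached poses at t=0.0 and t=1.0; query a quarter of the way.
print(interpolate_position(0.25, 0.0, (0.0, 0.0, 0.0), 1.0, (1.0, 0.0, 2.0)))
# → (0.25, 0.0, 0.5)
```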