Give us as much detail as possible regarding the issue you're experiencing.
ML2 OS version: 1.4
MLSDK version: 1.6.0
Host OS (Windows/MacOS): Windows 10
Error messages from logs (syntax-highlighting is supported via Markdown):
I am working on an app based on the camera preview sample that uses the depth camera to filter the RGB camera feed and display it. This is done by accessing the RGB camera data pointer, looping through the pixels, and setting a pixel to black (0, 0, 0, 255) if the depth reading at the corresponding location in the depth image says it should be filtered out.
We are seeing blue lines at regular locations in the image, at the same positions in every frame, and only in the regions that we filtered to black. We have inspected the RGB frames after our processing and there do not appear to be any blue pixels in the data, but when we pass the frame to `glTexImage2D` to display it, the lines appear.
Let me know if any further information on the code implementation is needed to answer this question.
How are you storing the processed RGB data after filtering? Is it in a simple byte array or something more complex?
We are simply overwriting the values of some pixels in the original RGBA_8888 image in order to filter it; the image stays in its original row-major format with 4 bytes per pixel.
If possible, please share the relevant sections of code where you create the OpenGL texture and the part where you do the depth-based filtering.
I have not created an OpenGL texture; I have modified the existing RGBA image in place, like this:

```cpp
uint8_t *ptr = (uint8_t *)data; // data is the RGBA buffer that already exists in the camera preview sample
```
`pixel_to_change` is the index of the current pixel as we iterate through the row-major image, and `pixel_index` is defined as its byte offset into `ptr`:

```cpp
int pixel_index = pixel_to_change * 4;
```
Depth filtering is then done by setting pixels to black when they exceed a depth threshold:

```cpp
if (depthVal > depth_threshold)
{
    ptr[pixel_index]     = 0; // red channel
    ptr[pixel_index + 1] = 0; // green channel
    ptr[pixel_index + 2] = 0; // blue channel
}
```
We were able to remove the blue lines in the filtered parts of the image that we set to black by setting the alpha (opacity) values of all of the filtered pixels to 0.
However, in parts of the image that are filtered but not set to black, this fix does not work. When the alpha values are set to 0, the lines turn red, and when they are set to 255 the lines turn blue again; values in between produce colors on a predictable spectrum from red to blue.
We also attempted rebuilding and displaying the image as RGB rather than RGBA. This again shows the horizontal blue lines, but this time across the entire image (presumably because the image has been copied into a new data structure in order to convert it to RGB).
Thank you for this information. I'll help you track down the cause of the issue. Do you mind sharing example images? Is the depth data you are receiving correct and only displaying incorrectly, or is the data itself also incorrect? Which stream are you trying to visualize?