How can I convert YUV_420_888 to JPG in C#

I want to capture images using Magic Leap while watching the device stream. To achieve this, I've implemented a CV camera, which I've learned captures in YUV format. Could someone assist me with converting these images to JPG? Is there a package or a Stack Overflow solution I might have overlooked? All the solutions I've come across so far are written in Java.

Note that the Magic Leap 2 can provide both RGB and YUV images from the camera. However, if you need to use the YUV format, you will first need to convert the image into RGB before saving it as a JPG. At a high level, the steps would be:

  1. Obtain the camera frame in YUV format
  2. Convert YUV into RGB (simple example: Visualize Camera Output | MagicLeap Developer Documentation)
  3. Use a 3rd-party library or Unity's built-in function to save the texture as a JPEG (Unity - Scripting API: ImageConversion.EncodeToJPG)
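Step 3 can be sketched like this, assuming step 2 already produced a readable RGB `Texture2D` (the class name, texture variable, and file path here are placeholders, not part of the Magic Leap SDK):

```csharp
using System.IO;
using UnityEngine;

public static class JpegSaver
{
    // Encode an existing RGB texture to JPEG bytes and write them to disk.
    // "cameraTexture" is assumed to be a readable Texture2D from step 2.
    public static void SaveAsJpeg(Texture2D cameraTexture, string path, int quality = 90)
    {
        // ImageConversion.EncodeToJPG returns the JPEG bytes; quality is 1-100.
        byte[] jpegBytes = ImageConversion.EncodeToJPG(cameraTexture, quality);
        File.WriteAllBytes(path, jpegBytes);
    }
}
```

Note that `EncodeToJPG` requires the texture to be readable (CPU-accessible), which is the case for textures created in script with `SetPixels32`/`Apply`.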

I tried capturing an RGB image, but then I can't use the device stream while capturing. For my project it is important to be able to constantly follow the application flow on the device stream.

Both the CV and Main camera support RGB format when using the MLCamera API. MLCamera (Deprecated) | MagicLeap Developer Documentation

The device stream should work properly as long as you do not block the Main Camera stream, and only use the CV Camera stream.


As I said, I tried with RGB, but it doesn't allow me to capture with the CV camera while streaming; when I use YUV it does. I found this answer, and that's why I'm trying this approach:

In the meantime I managed to get a black-and-white JPEG image using the following code:

public static byte[] ConvertYUVToJPEG(byte[] yPlane, byte[] uPlane, byte[] vPlane, int width, int height)
{
    // Note: uPlane and vPlane are not read here, which is why the result
    // is grayscale -- only the luminance (Y) plane is used.
    Color32[] rgbData = new Color32[width * height];
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            int yIndex = y * width + x;
            byte Y = yPlane[yIndex];
            // Use Y for all three channels (black and white)
            byte R = Y;
            byte G = Y;
            byte B = Y;
            rgbData[yIndex] = new Color32(R, G, B, 255);
        }
    }

    Texture2D texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
    texture.SetPixels32(rgbData);
    texture.Apply();

    //FlipTextureVertically(texture);

    byte[] jpegData = texture.EncodeToJPG();

    Debug.Log("Encoded JPEG data: " + jpegData.Length + " bytes");

    return jpegData;
}

Can you assist me with adapting this code so it produces a colored image?

Here is a link on how to do this: Visualize Camera Output | MagicLeap Developer Documentation

It includes information on how to convert YUV into RGB.

The YUV image is provided as separate planes: Y, U, and V. The example that I sent uses a shader to combine them into RGB and then writes the result to a texture.
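If you prefer to stay on the CPU instead of using the shader approach, your grayscale loop can be extended to sample the U and V planes as well. This is only a sketch: it assumes tightly packed planes with no row- or pixel-stride padding (real YUV_420_888 buffers often carry strides you would need to account for), chroma subsampled 2x2 so each U/V plane is (width/2) x (height/2), and full-range BT.601 coefficients:

```csharp
using UnityEngine;

public static class YuvToColor
{
    // Sketch: convert tightly packed YUV 4:2:0 planes to a colored RGBA texture.
    // Assumptions (verify against your actual buffer layout):
    //  - no row-stride padding in any plane
    //  - U and V planes are each (width/2) x (height/2)
    //  - full-range BT.601 conversion coefficients
    public static Texture2D ConvertYuv420ToTexture(
        byte[] yPlane, byte[] uPlane, byte[] vPlane, int width, int height)
    {
        Color32[] rgba = new Color32[width * height];
        for (int row = 0; row < height; row++)
        {
            for (int col = 0; col < width; col++)
            {
                int yIndex = row * width + col;
                // One chroma sample covers a 2x2 block of luma pixels.
                int uvIndex = (row / 2) * (width / 2) + (col / 2);

                float y = yPlane[yIndex];
                float u = uPlane[uvIndex] - 128f;
                float v = vPlane[uvIndex] - 128f;

                // BT.601 YUV -> RGB, clamped to the valid byte range
                int r = Mathf.Clamp(Mathf.RoundToInt(y + 1.402f * v), 0, 255);
                int g = Mathf.Clamp(Mathf.RoundToInt(y - 0.344136f * u - 0.714136f * v), 0, 255);
                int b = Mathf.Clamp(Mathf.RoundToInt(y + 1.772f * u), 0, 255);

                rgba[yIndex] = new Color32((byte)r, (byte)g, (byte)b, 255);
            }
        }

        Texture2D texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
        texture.SetPixels32(rgba);
        texture.Apply();
        return texture;
    }
}
```

From there, the existing `texture.EncodeToJPG()` call can be reused unchanged. If the colors come out wrong (e.g. swapped red/blue tints), the U and V planes are likely interleaved or swapped in your buffer, which is common with YUV_420_888 sources.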

That said, we will try to reproduce the issue that you’re mentioning: not being able to capture an RGB image when the device stream is active.

---

To capture the image, are you using the script on the developer portal or have you created a custom script?