How can I convert YUV_420_888 to JPG in C#

I want to capture images using Magic Leap while watching the device stream. To achieve this, I've implemented a CV camera, which I've learned captures in YUV format. Could someone assist me with converting these images to JPG? Is there a package or a Stack Overflow solution I might have overlooked? All the solutions I've come across so far are written in Java.

Note that the Magic Leap 2 can provide both RGB and YUV images from the camera. However, if you need to use the YUV format, you will first need to convert the image to RGB before saving it as a JPG. At a high level the steps would be:

  1. Obtain the camera Frame in YUV format
  2. Convert YUV into RGB (simple example: Visualize Camera Output | MagicLeap Developer Documentation)
  3. Use a 3rd-party library or Unity's built-in function to save the texture as a JPEG (Unity - Scripting API: ImageConversion.EncodeToJPG); a sketch of this step follows below.
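For step 3, something along these lines works with Unity's ImageConversion API (a minimal sketch; the class name, method name, and the use of Application.persistentDataPath are illustrative):

using System.IO;
using UnityEngine;

public static class JpegSaver
{
    // Encodes an RGB texture to JPEG and writes it to persistent storage.
    // "rgbTexture" is assumed to be the output of the YUV -> RGB step above.
    public static string SaveTextureAsJPG(Texture2D rgbTexture, string fileName)
    {
        byte[] jpegData = rgbTexture.EncodeToJPG(90); // quality 1-100
        string path = Path.Combine(Application.persistentDataPath, fileName);
        File.WriteAllBytes(path, jpegData);
        Debug.Log($"JPEG saved to {path}");
        return path;
    }
}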

I tried to capture an RGB image, but then I can't use the device stream while capturing. For my project it is important to be able to constantly follow the application flow on the device stream.

Both the CV and Main camera support the RGB format when using the MLCamera API. MLCamera (Deprecated) | MagicLeap Developer Documentation

The device stream should work properly as long as you do not block the Main Camera stream, and only use the CV Camera stream.
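For reference, connecting only the CV camera with the deprecated MLCamera API looks roughly like the sketch below. Treat the exact enum and method names as assumptions to verify against your SDK version:

using UnityEngine;
using UnityEngine.XR.MagicLeap;

public class CVCameraConnector : MonoBehaviour
{
    // Hedged sketch: connect the CV camera only (CamOnly), leaving the
    // Main Camera stream free for the device stream. Names are based on
    // the deprecated MLCamera API; verify against your SDK version.
    private async void ConnectCVCamera()
    {
        MLCamera.ConnectContext context = MLCamera.ConnectContext.Create();
        context.CamId = MLCamera.Identifier.CV;
        context.Flags = MLCamera.ConnectFlag.CamOnly;

        MLCamera cvCamera = await MLCamera.CreateAndConnectAsync(context);
        if (cvCamera == null)
        {
            Debug.LogError("Failed to connect to the CV camera.");
        }
    }
}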


As I said, I tried with RGB, but it doesn't allow me to capture with the CV camera while streaming; when I use YUV it does. I found this answer, and that's why I'm trying this approach:

In the meantime I managed to get a black-and-white JPEG image using the following code:

public static byte[] ConvertYUVToJPEG(byte[] yPlane, byte[] uPlane, byte[] vPlane, int width, int height)
{
    Color32[] rgbData = new Color32[width * height];
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            int yIndex = y * width + x;
            byte Y = yPlane[yIndex];
            // Convert Y to RGB (black and white); uPlane and vPlane are
            // ignored here, which is why the output is grayscale
            byte R = Y;
            byte G = Y;
            byte B = Y;
            rgbData[yIndex] = new Color32(R, G, B, 255);
        }
    }

    Texture2D texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
    texture.SetPixels32(rgbData);
    texture.Apply();

    // Unity textures are addressed bottom-up while camera frames usually
    // arrive top-down, so a vertical flip may be needed here:
    //FlipTextureVertically(texture);

    byte[] jpegData = texture.EncodeToJPG();

    Debug.Log("Encoded JPEG data (" + jpegData.Length + " bytes)");

    return jpegData;
}

Can you assist me with adapting this code to get a color image?

Here is a link on how to do this: Visualize Camera Output | MagicLeap Developer Documentation

It includes information on how to convert YUV into RGB.

The YUV image is provided as separate planes: Y, U, and V. The example that I sent uses a shader to combine them into RGB and then writes the result to a texture.
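For completeness, a CPU-side sketch of the same conversion, which maps directly onto your grayscale code (the shader is faster, but this shows the math). It assumes BT.601 full-range coefficients and tightly packed planes with a pixel stride of 1; real YUV_420_888 buffers often carry row and pixel strides that must be honored:

public static byte[] ConvertYUVToColorJPEG(byte[] yPlane, byte[] uPlane, byte[] vPlane, int width, int height)
{
    Color32[] rgbData = new Color32[width * height];
    int halfWidth = width / 2;
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            int yIndex = y * width + x;
            // Each chroma sample covers a 2x2 block of luma samples.
            int uvIndex = (y / 2) * halfWidth + (x / 2);
            float Y = yPlane[yIndex];
            float U = uPlane[uvIndex] - 128f;
            float V = vPlane[uvIndex] - 128f;
            // BT.601 full-range YUV -> RGB
            byte R = (byte)Mathf.Clamp(Y + 1.402f * V, 0f, 255f);
            byte G = (byte)Mathf.Clamp(Y - 0.344136f * U - 0.714136f * V, 0f, 255f);
            byte B = (byte)Mathf.Clamp(Y + 1.772f * U, 0f, 255f);
            rgbData[yIndex] = new Color32(R, G, B, 255);
        }
    }

    Texture2D texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
    texture.SetPixels32(rgbData);
    texture.Apply();
    return texture.EncodeToJPG();
}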

That said, we will try to reproduce the issue that you're mentioning: not being able to capture an RGB image while the device stream is active.
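Once the shader has written the combined RGB image into a RenderTexture, reading it back for JPEG encoding is plain Unity API. A sketch, assuming "renderTexture" is the target the camera material renders into:

public static Texture2D ReadBackRenderTexture(RenderTexture renderTexture)
{
    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = renderTexture;

    // ReadPixels copies from the currently active RenderTexture.
    Texture2D texture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGBA32, false);
    texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    texture.Apply();

    RenderTexture.active = previous;
    return texture; // texture.EncodeToJPG() then gives the JPEG bytes
}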


To capture the image, are you using the script on the developer portal or have you created a custom script?

I tried using the provided solution, but it seems like I'm missing something since the output is just black. Should I just generate a new Unlit shader and assign it a proper name, or is there a shader script somewhere in the documentation? By the way, I'm using the Simple Camera script from the developer portal for the capturing. I adapted the script to capture images instead of videos, using the example code as a guide. It's working well with both the CV and Main camera.

Were you able to find the YUV_Camera_Shader in your project?

Make sure that the YUV_Camera_Shader is not stripped from your project on build. This can be done by making sure that an object in your scene has a material that uses the YUV_Camera_Shader shader, or by adding it to Project Settings / Player / Preloaded Assets.
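A quick runtime check along these lines can confirm whether the shader survived stripping (the string passed to Shader.Find is an assumption here; it must match the name declared at the top of the .shader file, the `Shader "..."` line):

// Hypothetical sanity check; replace the string with the shader's
// declared name from the .shader file.
Shader yuvShader = Shader.Find("Unlit/YUV_Camera_Shader");
if (yuvShader == null)
{
    Debug.LogError("YUV_Camera_Shader was stripped from the build. " +
                   "Reference it from a scene material or add it to Preloaded Assets.");
}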

I didn't. I created the shader as a new Unlit shader and then added it to the Preloaded Assets. I added the required dependencies to the script via serialized fields, and I subscribed the OnCaptureDataReceived method to OnRawImageAvailable. Is there anything else I should do?

You may want to add logs or breakpoints to your app to see if OnCaptureDataReceived gets the correct data and if the Render Texture displays correctly.
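Something along these lines in the callback would show whether the planes actually contain data. The field names are assumptions based on the deprecated MLCamera.CameraOutput layout; verify them against your SDK version:

using UnityEngine;
using UnityEngine.XR.MagicLeap;

// Diagnostic sketch; plane field names may differ in your SDK version.
public static class CameraOutputLogger
{
    public static void LogCameraOutput(MLCamera.CameraOutput output)
    {
        Debug.Log($"Format: {output.Format}, planes: {output.Planes.Length}");
        for (int i = 0; i < output.Planes.Length; i++)
        {
            var plane = output.Planes[i];
            Debug.Log($"Plane {i}: {plane.Width}x{plane.Height}, stride {plane.Stride}, {plane.Data.Length} bytes");
        }
    }
}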

So the shader that should be used is the one generated by creating a new unlit shader? Thanks, I will add additional logs to check the data.

The shader is located under Packages/Magic Leap SDK/Runtime/Deprecated/Camera/Shaders/.
