Request for Support with Livestream Integration on Magic Leap 2

Dear Support Team,

I am currently experiencing challenges with integrating livestream functionality on the Magic Leap 2. Below, I have outlined the specific methods I have attempted, along with brief descriptions of each approach. I would appreciate any guidance you could provide regarding playing a livestream on the Magic Leap 2.

  1. VideoPlayer Canvas/Quad/Plane
    Description: Utilizing Unity's VideoPlayer component to display video content on various surfaces (Canvas, Quad, Plane) within the application.

Personal Experience: The best outcome I was able to achieve was playing videos in the Unity Editor itself, but never on the Magic Leap. No file I used would play on the ML2, and no stream, regardless of format, encoding, or sample project, would play in the VideoPlayer.

  2. Using Google API for YouTube Livestreams
    Description: Attempting to access and stream a compatible livestream from YouTube using the Google API.

Personal Experience: This was an attempt to get a link that the VideoPlayer component in the project would accept, but standard YouTube streams would not work, and the direct secure links accessed through their API would not work either.

  3. Accessing HLS Link of a Twitch Stream
    Description: Trying to retrieve and stream an HLS link directly from a Twitch broadcast.
    Reference: GitHub - dudik/twitch-m3u8: Get the stream URL of a Twitch livestream or past broadcast (VOD).

Personal Experience: With both the GitHub project above and ChatGPT, I was able to access the HLS link to a Twitch livestream in multiple ways, but I was unable to play the stream in Unity.
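As an aside, once a tool such as twitch-m3u8 returns a master playlist, it can help to verify the playlist outside Unity and pick a concrete variant URI before handing anything to a player. The following is only a debugging sketch, not part of any Magic Leap API; the playlist text is a made-up example of a standard `#EXT-X-STREAM-INF` master playlist:

```python
# Hedged sketch: pick the highest-bandwidth variant from an HLS master
# playlist. The playlist text below is a made-up example; in practice it
# would come from the .m3u8 URL returned by a tool like twitch-m3u8.
import re

MASTER_PLAYLIST = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480
480p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/index.m3u8
"""

def best_variant(master_text: str) -> str:
    """Return the URI of the variant with the highest BANDWIDTH."""
    best_bw, best_uri = -1, ""
    lines = master_text.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF"):
            m = re.search(r"BANDWIDTH=(\d+)", line)
            if m and i + 1 < len(lines):
                bw = int(m.group(1))
                if bw > best_bw:
                    best_bw, best_uri = bw, lines[i + 1].strip()
    return best_uri

print(best_variant(MASTER_PLAYLIST))  # -> 720p/index.m3u8
```

Feeding a player one concrete variant URI (rather than the master playlist) also rules out adaptive-bitrate handling as the point of failure.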

  4. Streaming via VLC Player
    Description: Using VLC to stream content from a camera, screen, or preexisting livestream to the device.

Personal Experience: I attempted to stream my camera, desktop, files, and preexisting livestreams with every combination of output and encoding; while I could see the video play locally in my browser, I could never get the video to play in Unity.

  5. Using OBS (Open Broadcaster Software)
    Description: Configuring OBS to stream content and integrating it with the Magic Leap device.

Personal Experience: OBS was another attempt to find a link compatible with Unity, but I did not have any success.

  6. Custom HLS Stream with SRS and Docker
    Description: Setting up a custom HLS stream using SRS (Simple Realtime Server) with Docker, and streaming in m3u8 format.

Reference: SRS Unity Documentation
Personal Experience: The entire process of setting up an HLS stream using SRS and Docker was a success; I was able to stream the video in multiple formats and view it locally, across devices, and in every manner possible except through the Unity VideoPlayer, Unity Render Streaming, the WebSockets approach discussed below, or WebRTC.
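One quick sanity check when debugging a custom SRS/HLS setup is to confirm that the media playlist the server emits really describes a live stream: per the HLS spec, a finished (VOD) playlist contains the `#EXT-X-ENDLIST` tag, while a live playlist omits it. A minimal stdlib-only Python sketch (the playlist text is a made-up example, not SRS output):

```python
# Hedged sketch: decide whether an HLS media playlist describes a live
# stream or a finished VOD. Per the HLS spec, a playlist containing the
# #EXT-X-ENDLIST tag is complete (VOD); a live playlist omits it.
LIVE_PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:6.0,
seg120.ts
#EXTINF:6.0,
seg121.ts
"""

def is_live(playlist_text: str) -> bool:
    """True if the playlist has no #EXT-X-ENDLIST tag (still live)."""
    return not any(
        line.strip() == "#EXT-X-ENDLIST"
        for line in playlist_text.splitlines()
    )

print(is_live(LIVE_PLAYLIST))                       # -> True
print(is_live(LIVE_PLAYLIST + "#EXT-X-ENDLIST\n"))  # -> False
```

If the playlist is live and also plays in a desktop player, the problem is almost certainly on the Unity/device side rather than the server side.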

  7. Working Video from UF Server
    Description: Successfully loading a video from a UF server with the video player, but facing issues with streaming functionality.

Personal Experience: This was the last attempt focused on the link to the stream, since there are similar projects that pull a video and are able to play it. Unfortunately, when I tried to run the video or a stream on the ML2, the quad/plane/canvas (or any other method of viewing the video) would not load.

  8. Unity Render Streaming
    Description: Attempting to use Unity's Render Streaming capabilities to facilitate video streaming.
    Reference: Unity Render Streaming Documentation
    Personal Experience: I followed every sample and could not get anything to play on the ML2, but I had success in the browser and across Unity projects.

  9. WebSocket Sharp Tutorial
    Description: Following the WebSocket Sharp tutorial to implement WebSocket communication for video streaming.
    Reference: WebSocket Sharp GitHub

    Personal Experience: I could not view anything in a running project on the ML2 or in the ML2’s browser.

  10. Unity WebRTC Integration
    Description: Exploring the use of Unity WebRTC for real-time communication and video streaming through available tutorials.
    References:

  11. Additional WebRTC Projects
    Description: Investigating further WebRTC projects relevant to Unity for potential streaming solutions.
    References:
  12. Klak NDI Tutorial
    Description: Attempting to use the Klak NDI tutorial for network device interface streaming within Unity.
    Reference: Klak NDI GitHub
    Personal Experience: I was able to send streams between two devices, but never to the Magic Leap.

I appreciate your time and assistance in addressing these challenges, and I look forward to your guidance on how to proceed. The end goal is to find a way to play any sort of livestream from anywhere on the Magic Leap 2, and to have a QR code (via Vuforia or AprilTag) as a parent of the livestream so it can be tracked. We already have working mechanisms/projects that use QR codes and AprilTags.

Thank you.

Best regards,
Andrew Dodds

The Magic Leap 2 supports Android 10’s standard Media Decoding API. This means you can decode video streams using a plugin that uses the Android APIs, as long as it works with Vulkan and x86_64.

---

The MLVideoPlayer component is deprecated, but it supported more stream options than Unity’s player. You can see the example in the v1.12.0 Unity Examples Project. Note that the API is no longer maintained, so you would need to use it at your own risk.

You can also see if the AVPro Video player works for your use case. I recently saw that they support Vulkan and Android x86_64.

---

When choosing the camera device in Unity, you will either need to specify the camera name for the WebCamTexture, or the index when using the MLCamera API (recommended).

The Unity WebCamTexture only supports accessing the Mixed Reality capture (virtual + physical content). If you need to access the image without the virtual content, you will need to use the MLCamera API and target the CV stream.

You cannot access the same stream twice (Main Camera vs. CV Camera).

Simple Unity WebRTC Example:

I spent today setting up the MagicLeap2UnityWebRTCExample, and it works perfectly. I was wondering whether it is possible to take the RenderTexture that displays the video feed and make it a static object, so that we could update its position with a QR code or AprilTag rather than having the video feed follow the user's gaze on the Magic Leap.

Thank you so much,
Andrew Dodds

Yes, you will just need to edit the components on the transform and disable the follow script.

Disabling the follow script works fine. However, adding any component from the UI controller to a Vuforia Image Target results in a completely black screen. This is after adding the VuforiaBehaviour to the main camera. Is there a suggested method for tracking the video feed?

Unfortunately, we're not too familiar with Vuforia on this forum. I recommend reaching out through their support channels:

https://developer.vuforia.com/support

What would be the easiest way to track the video feed, or to make the feed relative to a QR code or another mechanism? We have previously used Vuforia, but if there is a better way we would like to try it.

Thank you

I’m not sure what functionality you are looking for exactly, but you can see a list of our SDK features in our developer documentation. For example, Marker Understanding: