Hey everyone,
I am doing a college project in which I need to use the Magic Leap 2 headset to stream real-time point cloud data of the wearer's nearby environment (0.5–2 m) to my PC, or to stream the raw sensor data and process it into a point cloud on my PC. I have almost no experience with Unity and want to know whether this is possible without Unity. Either way, the main thing I want to know is: what do I need to do to achieve this functionality with the Magic Leap 2? I have spent some time researching it, but I am not really sure how to begin. A brief step-by-step breakdown of the whole pipeline would be extremely helpful!
Thanks in advance!
P.S. I have the Enterprise edition, if that matters.
Are you more familiar with OpenXR in C++?
Magic Leap 2 does not provide a built-in capability to stream sensor data directly to a PC, but you can achieve this functionality using the Pixel Sensor API or the (deprecated) MLSDK. The Pixel Sensor API is accessible via the Pixel Sensor OpenXR extension and allows you to retrieve depth data and other inputs from sensors such as the ToF sensor and world cameras.
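Once the depth frames reach the PC, turning them into a point cloud is standard pinhole back-projection. Here is a minimal Python sketch of that step; the intrinsics `fx`, `fy`, `cx`, `cy` below are made-up placeholder values, and on the device you would read the real ones from the depth frame's metadata:

```python
# Sketch: back-projecting a depth image into a 3-D point cloud on the PC side.
# The pinhole intrinsics (fx, fy, cx, cy) are placeholder values, not the
# actual Magic Leap 2 ToF intrinsics.

def depth_to_point_cloud(depth, width, height, fx, fy, cx, cy):
    """depth: flat row-major list of metric depth values, one per pixel.
    Returns a list of (x, y, z) camera-space points; invalid (<= 0) depths
    are skipped."""
    points = []
    for v in range(height):
        for u in range(width):
            z = depth[v * width + u]
            if z <= 0.0:          # no return from the ToF sensor
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Tiny 2x2 example with placeholder intrinsics
pts = depth_to_point_cloud([1.0, 0.0, 2.0, 1.5], 2, 2,
                           fx=100.0, fy=100.0, cx=1.0, cy=1.0)
```

The same math applies whether you back-project on the headset before streaming or on the PC after.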
If you’re open to using Unity, you could combine these APIs with the Unity WebRTC package to stream frames over the network. This approach would be more straightforward if you’re prioritizing real-time data and can work with image quality trade-offs.
Alternatively, if real-time frame rate is less critical and you prioritize data quality, you could:
- Capture sensor data (e.g., frames).
- Convert it into a PNG or similar format.
- Send it over the network using UDP or TCP.
Keep in mind that this method will result in significantly lower application performance.
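As a rough illustration of the network side of that approach, here is a minimal Python sketch of length-prefixed frame transfer over TCP. zlib compression stands in for PNG to keep the example dependency-free, and the loopback connection stands in for the headset-to-PC link:

```python
# Sketch: sending compressed frames over TCP with a 4-byte length prefix.
# zlib stands in for PNG encoding; loopback stands in for headset -> PC.
import socket
import struct
import threading
import zlib

def _recv_exactly(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def send_frame(sock, payload):
    data = zlib.compress(payload)
    sock.sendall(struct.pack("!I", len(data)) + data)  # big-endian length prefix

def recv_frame(sock):
    (length,) = struct.unpack("!I", _recv_exactly(sock, 4))
    return zlib.decompress(_recv_exactly(sock, length))

# Loopback demo
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
received = []

def pc_side():
    conn, _ = server.accept()
    received.append(recv_frame(conn))
    conn.close()

t = threading.Thread(target=pc_side)
t.start()
client = socket.socket()
client.connect(("127.0.0.1", port))
send_frame(client, b"\x00\x01" * 1000)   # fake 2000-byte depth frame
client.close()
t.join()
server.close()
```

The length prefix is what makes TCP usable here: without it, the receiver has no way to know where one frame ends and the next begins. With UDP you would skip the prefix but have to handle loss and per-datagram size limits yourself.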
Another approach would be to encode the depth data or captured images as an H.264 stream using the Android Media Encoder APIs, which could then be transmitted over the network without relying on WebRTC. However, custom encoding via the Android SDK is nontrivial: it requires a solid understanding of Android's Media APIs and additional work to optimize encoding.
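For context on what comes after the encoder, here is a small Python sketch of the packetizing step: splitting an H.264 Annex B bytestream into NAL units so each unit can be framed and sent over the network. The start codes and NAL bytes in the example are synthetic stand-ins, not real encoder output:

```python
# Sketch: splitting an H.264 Annex B bytestream into NAL units.
# Annex B delimits units with 3- or 4-byte start codes (00 00 01 / 00 00 00 01).
# The sample bytes below are synthetic, not real encoder output.

def split_nal_units(stream):
    units = []
    i, start = 0, None
    n = len(stream)
    while i < n:
        if stream[i:i+4] == b"\x00\x00\x00\x01":
            sc = 4
        elif stream[i:i+3] == b"\x00\x00\x01":
            sc = 3
        else:
            i += 1
            continue
        if start is not None:
            units.append(stream[start:i])  # close the previous unit
        i += sc
        start = i
    if start is not None:
        units.append(stream[start:])       # final unit runs to end of stream
    return units

# Synthetic stream: 4-byte start code, unit, 3-byte start code, unit
units = split_nal_units(b"\x00\x00\x00\x01\x67\x42\x00\x00\x01\x68\xce")
```

In practice the SPS/PPS units must reach the receiver before any decodable frame, which is one of the details that makes hand-rolling this pipeline nontrivial.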
Note: we can help you with the relevant SDK APIs, but the implementation of the streaming pipeline itself will require additional resources and outside expertise, since it is not specific to Magic Leap and falls outside the scope of our forum support. Let us know which direction you're leaning, and we'll try to point you in the right direction.