Can Magic Leap 2 work in this way?

This is the first time I have developed applications in AR, so my questions might be stupid. I am trying to build an industrial application that users will generally use in dark rooms, so I want to use an extra head tracking system to track their heads instead of the built-in one, then render the content and display it on the users' glasses. I am now choosing the right AR/VR glasses and want to make sure Magic Leap 2 allows me to do this.

The pipeline I have in mind is the following (a rough Unity sketch follows below):
1. Eye tracking gets the gaze angles of both eyes.
2. A separate head tracking system gets the head coordinates.
3. The eye angles, together with the head coordinates, are sent to Unity or another engine, which renders the views for both eyes accordingly.
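
To make steps 2 and 3 concrete, here is a minimal Unity sketch of what I imagine. `ExternalTracker` is just a placeholder for my own tracking system, not a real API, and I assume the headset's built-in pose driver would have to be disabled so the two don't conflict:

```csharp
using UnityEngine;

// Stub standing in for my external head tracking system (not a real API).
public static class ExternalTracker
{
    public static bool TryGetPose(out Vector3 position, out Quaternion rotation)
    {
        position = Vector3.zero;
        rotation = Quaternion.identity;
        return false; // replace with real tracker output
    }
}

// Drives the XR camera rig from the external tracker instead of the
// built-in head tracking; the engine then derives both eye views from it.
public class ExternalHeadPoseDriver : MonoBehaviour
{
    [SerializeField] private Transform cameraRig; // root of the camera rig

    void LateUpdate()
    {
        if (ExternalTracker.TryGetPose(out Vector3 position, out Quaternion rotation))
        {
            cameraRig.SetPositionAndRotation(position, rotation);
        }
    }
}
```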

My questions are:
1. Can I use Magic Leap 2 in this way? If it works, will this pipeline have higher latency than the native one, assuming the head tracking system is much faster than the 120 Hz refresh rate of the display?
2. In general, which module handles converting the eye angles and head coordinates into rendered views: the Unity SDK, MRTK, URP, or something else? Do I need to write this algorithm myself?
3. Does it render every frame of the scene based on the eye angles detected in real time, assuming I don't need eye-focus interaction?
4. Does Unity, or any engine compatible with Magic Leap 2, currently support this kind of external head tracking input?
5. Does Magic Leap 2 have a display-only mode, in which I can let Unity or other engines/apps display content on the Magic Leap 2 directly through the USB-C port, without involving any head or eye tracking data?
6. Is the 2,000 nits brightness the final light energy reaching the eyes after all the optical components, or the brightness of the light source before it passes through them?
7. When I use AR Cloud, assuming there are sufficient hardware resources: the documentation says AR Cloud supports spaces of up to 10,000 m². Is this the biggest size one user can experience in one project? What if I want to map a construction site larger than that?

Thanks. I have to understand how Magic Leap 2 works before making the decision, and I really love the performance of Magic Leap 2.

Hi @JCollins,

Thank you for reaching out. We are so glad to hear that you are interested in the Magic Leap 2 headset.

I will answer some of your questions here. For the others, I will need to consult my team to make sure I am giving you the most accurate information possible.

1.) The Magic Leap 2 headset is able to work with external hardware through the USB-C port on the compute pack as long as that device is compatible with Android devices. There is only one port, so it is not possible to charge the headset while using external hardware.

2.) The eye data is securely acquired and distributed by the headset, subject to permissions. Once that data is acquired, you can use it in your applications through our APIs.
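
For example, here is a minimal sketch using Unity's generic XR input API; it assumes the app has already requested and been granted eye tracking permission:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: read gaze data through Unity's generic XR input API.
// Assumes eye tracking permission has already been requested and granted.
public class GazeReader : MonoBehaviour
{
    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.CenterEye);
        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.eyesData, out Eyes eyes) &&
            eyes.TryGetFixationPoint(out Vector3 fixation))
        {
            // World-space point where the user's gaze converges.
            Debug.Log($"Fixation point: {fixation}");
        }
    }
}
```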

3.) Eye input is not required for the most basic functions of the headset. We only acquire eye data if the user grants permissions.

4.) Unity does not offer built-in support for some external hardware. You would have to check with Unity to see if they have plugins for the particular hardware you are using.
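
If no plugin exists, one common fallback is to stream poses from your tracking PC to the headset yourself, for example over Wi-Fi. A rough sketch of the receiving side in Unity follows; the UDP transport, port number, and 28-byte packet layout (three floats for position, four for rotation) are all assumptions for illustration:

```csharp
using System.Net;
using System.Net.Sockets;
using UnityEngine;

// Sketch: receive head poses streamed from an external tracking system.
// Packet layout (assumed): 3 floats position + 4 floats rotation = 28 bytes.
public class UdpPoseReceiver : MonoBehaviour
{
    [SerializeField] private int port = 9000; // arbitrary example port
    private UdpClient client;

    void Start()
    {
        client = new UdpClient(port);
    }

    void Update()
    {
        // Drain any packets that arrived since the last frame.
        while (client.Available > 0)
        {
            IPEndPoint remote = null;
            byte[] data = client.Receive(ref remote);
            if (data.Length < 28) continue;

            Vector3 pos = new Vector3(
                System.BitConverter.ToSingle(data, 0),
                System.BitConverter.ToSingle(data, 4),
                System.BitConverter.ToSingle(data, 8));
            Quaternion rot = new Quaternion(
                System.BitConverter.ToSingle(data, 12),
                System.BitConverter.ToSingle(data, 16),
                System.BitConverter.ToSingle(data, 20),
                System.BitConverter.ToSingle(data, 24));

            transform.SetPositionAndRotation(pos, rot);
        }
    }

    void OnDestroy()
    {
        client?.Close();
    }
}
```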

7.) AR Cloud allows you to have spatial maps that are larger than 10,000 m².

We appreciate your curiosity and I will report back to you as soon as I learn more about questions 5 and 6.

Best,

El

Could you please provide an example of what you mean by this? I want to make sure that I can help to the best of my ability.

In regard to the sixth question: nits measure luminance, the amount of light emitted per unit area in a given direction (1 nit = 1 candela per square metre), rather than total light energy.

The display brightness is sufficient for bright environments. In a dark environment the display appears very bright, and it is usually dimmed down to avoid discomfort and save battery power.

The display-only mode I mentioned is more like a streaming mode, in which the headset gets video or a sequence of images from a computer and plays whatever the computer sends, just like a monitor in front of one's eyes. If it works like that, will there be any latency?

It sounds like you want to do remote rendering, which is supported and would not require you to do all of the manual work you are proposing.

We don't support DisplayPort Alt Mode input (direct video input over USB-C).

Are you trying to extend your computer's display?

Let me know if you have any further questions.

Best,

El

Yes. Basically, I want to extend the display using the display-only mode. It doesn't have to be hardware-level direct USB-C; if I can use some software to show what is displayed on my computer screen on the glasses at fairly low latency, that is fine. Just like some VR glasses do.

Hi @JCollins,

For this use case, we recommend checking out Remote Rendering.

Let me know if you have any further questions.

Best,

El