Guidance for an MR behavioral experiment with static object positioning and data collection
Hi, I'm currently looking for guidance on designing an AR/MR behavioral experiment in which I need to track participants' behavior while they physically navigate and interact within a mixed reality space. Here are the requirements and details of my project:
- I need to place/anchor objects at fixed locations.
- The experiment will be repeated over time with many participants, so I need the app to always spawn the objects at the designated fixed locations every time I reopen it.
- I'm planning to do all my development in Unreal.
Here are my doubts based on my current level of understanding:
- I was planning to use spatial anchors, but I'm not sure exactly how to set that up. I read that I can scan my environment and export it as a .glb file. How can I use that in Unreal to anchor objects in my real space?
- Secondly, how can I make sure in my Unreal project that objects are spawned relative to the spatial anchors, and not relative to wherever the headset happens to be when it is turned on? Do I need to use a QR code to initiate spatial tracking?
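To make my intent concrete, here is pseudocode (written in an Unreal C++ style) of the flow I imagined, based on what I've read about Unreal's generic ARPin / "pin local store" API. The function names and the SpawnStimulusFor helper are my assumptions, not verified code, and I don't know whether the Magic Leap plugin actually supports this path, so please correct me if the flow is wrong:

```
// Pseudocode sketch only — depends on Unreal's AR modules and will not
// compile outside an Unreal project. All names below are my assumptions.

void AExperimentManager::RestoreAnchoredObjects()
{
    // On app start, reload pins saved in a previous session. I hope these
    // resolve against the scanned space, not against the headset boot pose.
    TMap<FName, UARPin*> SavedPins = UARBlueprintLibrary::LoadARPinsFromLocalStore();

    for (const TPair<FName, UARPin*>& Entry : SavedPins)
    {
        // SpawnStimulusFor is my own hypothetical lookup from pin name to
        // the experiment object that belongs at that location.
        AActor* Stimulus = SpawnStimulusFor(Entry.Key);
        Stimulus->SetActorTransform(Entry.Value->GetLocalToWorldTransform());
    }
}

void AExperimentManager::SaveAnchorFor(AActor* Stimulus, FName PinName)
{
    // During setup: pin the placed object and persist the pin so the
    // location survives app restarts.
    UARPin* Pin = UARBlueprintLibrary::PinComponent(
        Stimulus->GetRootComponent(),
        Stimulus->GetActorTransform());
    UARBlueprintLibrary::SaveARPinToLocalStore(PinName, Pin);
}
```

Is this roughly the right mental model for persistent anchors on the ML2, or is there a Magic-Leap-specific anchors API I should be using instead?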
- Since I need to collect positional and orientation data about users' behavior, how can I store this data on the Magic Leap? Is it the same as coding file IO for a normal Unreal project, or do I need some special file IO to collect and save data while the application is running on the ML2?
- Is there a way to directly collect eye tracking data on the ML2, and how can I enable this in my app using Unreal?
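For context, here is the kind of per-frame logging I have in mind, written in plain standard C++ just to show the data format. My assumption (which I haven't verified on device) is that inside Unreal I would swap the std::ofstream for FFileHelper::SaveStringToFile and build the path from FPaths::ProjectSavedDir(); the pose values themselves would come from the headset's tracking each tick:

```cpp
// Plain standard-C++ sketch of the pose logging I want. On device I assume
// the file IO would go through Unreal's FFileHelper/FPaths instead.
#include <fstream>
#include <iomanip>
#include <sstream>
#include <string>

// Format one sample (timestamp, head position, head orientation) as a CSV row.
std::string FormatPoseRow(double t_sec,
                          double px, double py, double pz,
                          double pitch, double yaw, double roll)
{
    std::ostringstream row;
    row << std::fixed << std::setprecision(3)
        << t_sec << ',' << px << ',' << py << ',' << pz << ','
        << pitch << ',' << yaw << ',' << roll;
    return row.str();
}

// Append a row to the session log, writing a header first if the file is new.
void AppendPoseRow(const std::string& path, const std::string& row)
{
    std::ifstream probe(path);
    const bool needs_header = !probe.good();
    probe.close();

    std::ofstream out(path, std::ios::app);
    if (needs_header)
        out << "t_sec,pos_x,pos_y,pos_z,pitch,yaw,roll\n";
    out << row << '\n';
}
```

Would appending to a CSV in the app's saved directory like this work while the app runs on the ML2, or are there sandboxing/permission issues I should know about?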
Lastly, if you think there is an alternative approach I'm not aware of, feel free to suggest it. I'd be very happy for any input.