I was wondering if it's possible to go through the eye calibration process without using the hands or the controller. I managed to launch the eye calibration app from my Unity app, but it is still mandatory to select the eye tab, start the calibration process, and, after successful calibration, confirm a window to close the app again. Hence, three controller (or hand) clicks are necessary. Unfortunately, my app relies heavily on accurate eye tracking and prevents the use of hands or a controller.
Is there some kind of API to calibrate the eyes directly within my Unity app? Or could I jump directly into the calibration process without having to manually press the button beforehand? Or is there any other possibility?
Big thanks for your support.
MLEyeCalibration is the API that only reports the eye calibration state; it cannot start or drive the calibration itself.
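For reference, polling that state from a Unity script could look roughly like the sketch below. This is a hedged illustration, not verified code: the `MLEyeCalibration.GetState` call, the `State` struct, and the `IsOk` check are assumed from the Magic Leap Unity SDK surface and may differ slightly in your SDK version.

```csharp
using UnityEngine;
using UnityEngine.XR.MagicLeap; // Magic Leap Unity SDK namespace (assumed)

public class EyeCalibrationStatusCheck : MonoBehaviour
{
    void Update()
    {
        // GetState only reports the current calibration status;
        // it cannot start or complete a calibration.
        MLResult result = MLEyeCalibration.GetState(out MLEyeCalibration.State state);
        if (result.IsOk)
        {
            Debug.Log($"Eye calibration status: {state.Status}");
        }
    }
}
```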
We do have a custom Android intent (com.magicleap.intent.action.EYE_CALIBRATION) that you can use from your app to launch the eye calibration process, but at the moment you'd still need input to complete the process.
You can check the C API voice_intents sample app to see how to launch eye calibration via Android intents.
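From a Unity app, firing that intent can be sketched with Unity's standard Android interop classes. The intent action string is the one mentioned above; the rest is an untested sketch and assumes a standard Unity-on-Android setup.

```csharp
using UnityEngine;

public class EyeCalibrationLauncher : MonoBehaviour
{
    // Intent action from the post above; launching it opens the system
    // eye calibration flow (input is still needed to complete it).
    const string EyeCalibrationAction = "com.magicleap.intent.action.EYE_CALIBRATION";

    public void LaunchEyeCalibration()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var intent = new AndroidJavaObject("android.content.Intent", EyeCalibrationAction))
        {
            activity.Call("startActivity", intent);
        }
#endif
    }
}
```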
What is the flow that you are trying to create when calibrating the eyes? Do you want to avoid all input? Would you want to use voice input for calibration? Do you want a module to come up within your app without going to a separate app?
My colleague went on vacation, so I'll answer the questions.
@ejurinsky Thanks for the input. We achieve similar behaviour by launching the app (one interaction more than with your intent). Still, the user needs to click "start" and "quit", which is an issue for us.
The flow that we want to achieve would be to avoid all input. We do not have a preference between a module and a separate app, as long as we can ensure a flow without controller/hand/voice interaction.
More generally: if the calibration step itself (looking at the dots) were independent of everything else and developers could start it directly, we could get the best calibration results thanks to Magic Leap's experience while customizing the rest of the flow ourselves (explanation to the user, interaction to start/end the calibration, etc.). Whether that takes the form of a module or of a specific flow in the Custom Fit app triggered by a dedicated intent does not matter to us.
Thanks for your support
I just wanted to let you know that we have captured your feedback and this is still in progress for us. Thank you for your patience.
Would starting and stopping eye calibration using voice intents be sufficient?
Thanks for the update. Yes, using voice intents is sufficient, as long as going through the calibration doesn't require controller and/or hand input.