Hi there! For various reasons, we are unable to use the MagicLeap Eye Calibration process in our workflow. There is currently no hands-free option, which is a requirement for our application, and even if there were, it's not feasible for us to launch another application just to calibrate.
I was wondering what exactly the calibration process does. I am using the Eye Tracking API with mlsdk v1.2.0. Upgrading to a later version is possible, and we do plan to do so in the future.
Does the calibration process simply adjust the vergence, i.e., translate the eye positions and rotate them to their respective orientations? If so, I am considering writing my own simplified calibration process so I can apply these adjustments to the uncalibrated eye data myself. Would I be missing other calibration features by doing this?
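To illustrate what I mean by a simplified calibration, here's the rough shape of what I have in mind (plain geometry, nothing Magic Leap specific; the sampling flow, function names, and yaw/pitch parameterization are all my own, not from the API): collect pairs of measured vs. expected gaze directions while the user fixates known targets, fit an average angular offset, and apply that offset to subsequent uncalibrated samples.

```python
import math

def dir_to_yaw_pitch(d):
    # d = (x, y, z) unit gaze direction, +z forward, +y up
    x, y, z = d
    return math.atan2(x, z), math.asin(max(-1.0, min(1.0, y)))

def yaw_pitch_to_dir(yaw, pitch):
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def fit_offset(samples):
    # samples: list of (measured_dir, true_dir) pairs gathered while the
    # user fixates known targets. Returns the mean (yaw, pitch) correction.
    dy = dp = 0.0
    for measured, true in samples:
        my, mp = dir_to_yaw_pitch(measured)
        ty, tp = dir_to_yaw_pitch(true)
        dy += ty - my
        dp += tp - mp
    n = len(samples)
    return dy / n, dp / n

def apply_offset(d, offset):
    # Correct a raw gaze direction with the fitted per-user offset.
    yaw, pitch = dir_to_yaw_pitch(d)
    return yaw_pitch_to_dir(yaw + offset[0], pitch + offset[1])
```

This obviously assumes the per-user error is a roughly constant angular bias, which is exactly the part I'm unsure about — hence the question of whether the real calibration does more than this.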
In my experience, the eye tracking API can occasionally return nonsensical data when there is no calibration. For example, the reported eye vergence location is somewhere behind the head. I haven't pinned down exactly how to recover from this state, but it seems I need to move my head/eyes around some before the Eye Tracking picks back up with valid data. In this state:
- There is no reported error: `MLResult_Ok` is returned when querying the eye tracking data.
- The `left_center`, `right_center`, and vergence confidences are all either 0.75 or 1.
I have no way to recognize from the API that eye tracking is less than nominal in this state.
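Since the API isn't flagging these frames, I'm considering rejecting them myself with a geometric sanity check along these lines (the thresholds are arbitrary placeholders of mine, and everything here is in head-relative terms I'm computing myself, not API fields):

```python
import math

def vergence_is_plausible(fixation_point, head_pos, head_forward,
                          min_dist=0.1, max_dist=10.0):
    # Reject fixation points that are behind the head or at an
    # implausible distance. head_forward is a unit vector.
    v = tuple(f - h for f, h in zip(fixation_point, head_pos))
    dist = math.sqrt(sum(c * c for c in v))
    if not (min_dist <= dist <= max_dist):
        return False
    # Positive dot product means the point is in front of the head.
    return sum(a * b for a, b in zip(v, head_forward)) > 0.0
```

That would at least let me drop the "behind the head" frames, though it doesn't tell me when the tracker has recovered.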
Our application would benefit from very accurate eye tracking, but for now I've implemented a kind of tolerance zone where the user only needs to look fairly close to a target to activate some features. This seems to account decently for the lack of calibration, since everyone's eyes are different. I am also handling blinks and frames where no eyes are found, so I don't think the issue is due to blinking.
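For reference, the tolerance zone is just an angular threshold between the gaze ray and the direction to the target; the 5° default below is simply what I'm experimenting with, not a recommended value:

```python
import math

def gaze_hits_target(gaze_origin, gaze_dir, target_pos, tolerance_deg=5.0):
    # gaze_dir is assumed to be a unit vector.
    to_target = [t - o for t, o in zip(target_pos, gaze_origin)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0.0:
        return False
    to_target = [c / norm for c in to_target]
    # Compare the angle between gaze and target direction to the tolerance.
    cos_angle = sum(a * b for a, b in zip(gaze_dir, to_target))
    return cos_angle >= math.cos(math.radians(tolerance_deg))
```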
I feel like I am very close to having something that works reliably, albeit with diminished accuracy. And if it's possible, I'd like to do my own calibration! But the occasional nonsensical eye data can be rather disruptive.
To summarize my questions:
- Is the MagicLeap Eye Calibration process doing anything other than adjusting the position and rotation of the Eye Tracking Data?
- Is it possible for me to do my own Eye Calibration with reliable results?
- Are these occurrences of nonsensical Eye Tracking Data due to there being no Eye Calibration?
Thank you to anyone who has taken the time to read this far!!
Android API Level 29