Abstract
This disclosure describes a solution for calibrating the coordinate frames of a head-worn device. Unlike existing eye-tracking approaches for head-worn devices, such as mixed reality (MR)/augmented reality (AR) glasses or head-mounted displays (HMDs), which require disruptive calibration routines such as following virtual dots, this approach correlates a user's gaze with real-world objects. By utilizing scene understanding (e.g., simultaneous localization and mapping (SLAM) or object recognition) to map the physical environment, the device prompts the user to gaze at a real-world object or audio source. The system then monitors the user's gaze and determines the spatial offset between the tracked gaze and the actual object location within the component coordinate frames. This offset is used to automatically and continuously refine an artificial intelligence (AI) model that updates the calibration parameters. The process functions with or without a display or scene imagery, reducing parallax errors and maintaining accuracy when the device shifts on the user's head, without interrupting the user's workflow.
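The offset computation at the core of this flow can be illustrated with a minimal sketch. The Python snippet below is not part of the disclosure; the names (gaze_to_object_offset, CalibrationRefiner) and the exponential-moving-average update are illustrative assumptions standing in for the AI model described above. It computes the angular offset between a tracked gaze ray and the known direction from the eye to a real-world object, with both expressed in the same device coordinate frame, and folds successive offsets into a running correction.

```python
import numpy as np

def gaze_to_object_offset(gaze_dir, eye_pos, object_pos):
    """Angular offset (radians) and correction axis between the tracked
    gaze ray and the direction from the eye to a known object, both in
    the device coordinate frame."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    target_dir = object_pos - eye_pos
    target_dir /= np.linalg.norm(target_dir)
    cos_a = np.clip(np.dot(gaze_dir, target_dir), -1.0, 1.0)
    axis = np.cross(gaze_dir, target_dir)  # rotation axis that corrects the gaze ray
    return np.arccos(cos_a), axis

class CalibrationRefiner:
    """Exponential moving average over observed offsets; a hypothetical
    stand-in for the AI model that continuously updates calibration."""
    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.correction = np.zeros(3)  # axis-angle correction, device frame

    def update(self, gaze_dir, eye_pos, object_pos):
        angle, axis = gaze_to_object_offset(gaze_dir, eye_pos, object_pos)
        norm = np.linalg.norm(axis)
        if norm > 1e-9:  # skip degenerate (already-aligned) samples
            self.correction = ((1 - self.alpha) * self.correction
                               + self.alpha * angle * axis / norm)
        return self.correction

# Example: user gazes at an object 2 m ahead and slightly to the right,
# while the tracker reports a straight-ahead gaze.
refiner = CalibrationRefiner()
correction = refiner.update(np.array([0.0, 0.0, 1.0]),
                            np.zeros(3),
                            np.array([0.05, 0.0, 2.0]))
```

In this sketch, each prompted gaze event contributes one offset sample, so the correction converges gradually and tolerates noisy individual fixations; the disclosed system would instead feed such offsets into its AI model to update the calibration parameters.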
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.
Recommended Citation
Gonzalez Franco, Mar; Ahuja, Karan; Yang, Qiao; Gonzalez, Eric Jordan; Colaco, Andrea; Patel, Khushman Jayantilal; and Gurumurthy, Prasanthi, "A Method For Calibrating Head-Worn Systems", Technical Disclosure Commons, (February 10, 2026)
https://www.tdcommons.org/dpubs_series/9306