Abstract
This disclosure describes a novel gaze- and head-tracking calibration procedure for PC users. Calibration is the process of estimating a participant's ocular and head-pose features, which serves as the foundation for personalized and precise gaze-point computation. The calibration technique proposed in this disclosure applies to both types of trackers: external cameras mounted on the PC and software that uses the PC's built-in camera. Current techniques require the PC user to focus on pre-defined gaze points at the corners of the screen before the software can perform gaze tracking. In contrast, the proposed solution calibrates more effectively by computing, in real time and in the background without the user's notice, the offset between the screen coordinates of the user's incidental mouse clicks and the gaze point estimated by the software. The offset is then fed to a reinforcement-learning model that performs Q-learning-based calibration for the user in real time.
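The click-to-gaze offset and the Q-learning update described above can be sketched as follows. This is a minimal, hypothetical illustration, not the disclosed implementation: it assumes a 1-D calibration bias (x axis only, for brevity), coarse discretized error bins as Q-learning states, and small pixel nudges to the bias as actions, with reward equal to the negative post-nudge error magnitude. All names (`offset`, `QCalibrator`, `observe`) and parameter values are invented for this sketch.

```python
import random

def offset(click_xy, gaze_xy):
    """Offset between a click location (taken as the true gaze point)
    and the tracker's gaze estimate, in screen pixels."""
    return (click_xy[0] - gaze_xy[0], click_xy[1] - gaze_xy[1])

class QCalibrator:
    """Tabular Q-learning over a 1-D calibration bias (sketch only)."""
    ACTIONS = (-5, 0, +5)  # pixel nudges applied to the bias

    def __init__(self, alpha=0.5, gamma=0.9, epsilon=0.1):
        self.q = {}        # (state, action) -> learned value
        self.bias = 0      # current calibration correction, in pixels
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def _state(self, err):
        # Bucket the residual error into coarse bins, clipped to [-10, 10].
        return max(-10, min(10, err // 10))

    def observe(self, click_x, gaze_x):
        """Process one incidental click: pick a nudge (epsilon-greedy),
        apply it, and update the Q-table from the resulting error."""
        err = click_x - (gaze_x + self.bias)
        s = self._state(err)
        if random.random() < self.epsilon:
            a = random.choice(self.ACTIONS)          # explore
        else:
            a = max(self.ACTIONS,
                    key=lambda act: self.q.get((s, act), 0.0))  # exploit
        self.bias += a
        new_err = click_x - (gaze_x + self.bias)
        s2 = self._state(new_err)
        reward = -abs(new_err)  # smaller residual error -> higher reward
        best_next = max(self.q.get((s2, a2), 0.0) for a2 in self.ACTIONS)
        old = self.q.get((s, a), 0.0)
        # Standard Q-learning update rule.
        self.q[(s, a)] = old + self.alpha * (reward + self.gamma * best_next - old)
        return self.bias
```

In use, each background mouse click would feed one `observe` call, so the bias drifts toward cancelling the tracker's systematic error without any explicit calibration screen.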
Creative Commons License
This work is licensed under a Creative Commons Attribution-Share Alike 4.0 License.
Recommended Citation
INC, HP, "Novel and unobtrusive technique for gaze and head tracking calibration in real-time for PC users", Technical Disclosure Commons, (September 09, 2025)
https://www.tdcommons.org/dpubs_series/8572