Abstract

Eye-tracking systems in head-mounted displays are often compromised by environmental and physiological factors, such as sensor washout from high ambient infrared light, physical occlusion from eyelashes or makeup, and mechanical misalignment. Such failures typically result in erratic input or a total loss of interface responsiveness.

A method is disclosed for monitoring the health or status of eye-tracking sensors and executing an automated fallback to alternative input modalities. Diagnostic heuristics are used to categorize failure modes by analyzing pixel-intensity distributions, glint signal-to-background ratios, and pupil-edge continuity. When a failure is confirmed and persists beyond a specified temporal threshold, a transition to hand-gesture tracking or head-pose steering is initiated. This process includes semantic inheritance to maintain the cursor position during the hand-off. The disclosed technology improves operational robustness and reduces user interface latency by ensuring continuous input availability regardless of the eye-tracking state.

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
