Abstract

User input to desktop and laptop computers is largely via the keyboard, pointing devices such as a mouse or trackpad, and, to some extent, the touchscreen. Thus far, using eye gaze as an input based on data from on-device cameras (e.g., laptop or desktop cameras) has required calibration steps; even so, because such cameras are relatively far from the user, the resulting input lacks precision. This disclosure describes techniques for human-computer interaction based on eye gaze derived from the user’s smart glasses. With user permission, the user’s eye movements, as captured by the camera on the smart glasses, provide an additional interactive channel to the user’s other devices (e.g., laptop, desktop, etc.) to enable gaze-based actions such as scroll focus, text focus, notification dismissal, window focus, auto-scrolling of text, etc.
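The sketch below illustrates one way the receiving device could consume such a channel: gaze samples streamed from the glasses (simulated here) are mapped to screen regions, and a dwell longer than a threshold triggers the associated action, such as dismissing a notification or focusing and auto-scrolling a window. This is a minimal illustration under stated assumptions; the names (GazeEvent, Region, DwellDetector, the simulated event source) are hypothetical and not part of the disclosure or of any actual device API.

import time
from dataclasses import dataclass
from typing import Callable, Iterable, Optional


@dataclass
class GazeEvent:
    """One gaze sample in normalized screen coordinates (0..1), with a timestamp in seconds."""
    x: float
    y: float
    timestamp: float


@dataclass
class Region:
    """A rectangular screen region associated with an action (e.g., a window or scroll area)."""
    name: str
    left: float
    top: float
    right: float
    bottom: float
    action: Callable[[], None]

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


class DwellDetector:
    """Fires a region's action when gaze dwells inside it for at least `dwell_seconds`."""

    def __init__(self, regions: Iterable[Region], dwell_seconds: float = 0.6):
        self.regions = list(regions)
        self.dwell_seconds = dwell_seconds
        self._current: Optional[Region] = None
        self._entered_at: Optional[float] = None
        self._fired = False

    def feed(self, event: GazeEvent) -> None:
        region = next((r for r in self.regions if r.contains(event.x, event.y)), None)
        if region is not self._current:
            # Gaze moved to a different region (or off all regions): reset the dwell timer.
            self._current = region
            self._entered_at = event.timestamp
            self._fired = False
            return
        if region is not None and not self._fired:
            if event.timestamp - self._entered_at >= self.dwell_seconds:
                region.action()
                self._fired = True


if __name__ == "__main__":
    # Simulated gaze samples standing in for the smart-glasses camera feed.
    regions = [
        Region("notification", 0.8, 0.0, 1.0, 0.1, lambda: print("dismiss notification")),
        Region("document", 0.1, 0.2, 0.9, 0.9, lambda: print("focus window / auto-scroll")),
    ]
    detector = DwellDetector(regions, dwell_seconds=0.5)
    t0 = time.time()
    for i in range(10):  # gaze dwells inside the "document" region
        detector.feed(GazeEvent(0.5, 0.5, t0 + i * 0.1))

In an actual deployment, the events would arrive over the wireless link between the glasses and the host, and the region actions would call into the host operating system's input or accessibility APIs rather than printing.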

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
