This publication describes techniques for improving user interaction (e.g., dismissal, expansion) with application notifications on computing devices such as smartphones, tablets, or smart glasses. The techniques use multiple on-device sensors (e.g., a camera sensor, an accelerometer) to measure various forms of user input, giving users convenient and quick options for interacting with notifications. Machine-learned (ML) models that employ gaze detection, custom gestures (e.g., hand movement, eye blinking), and/or user actions (e.g., shaking or rotating the device) enable users to expand, individually dismiss, or batch dismiss application notifications, either independently of or in conjunction with notification priority levels (e.g., the urgency of the notification, preselected user importance levels).
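The accelerometer-driven interaction described above can be illustrated with a minimal sketch: a window of accelerometer magnitude readings is classified as a shake gesture, and a detected shake batch-dismisses notifications below a priority cutoff while urgent ones survive. All names, thresholds, and the priority scale here are illustrative assumptions, not the publication's implementation.

```python
from dataclasses import dataclass

# Assumed tuning constants: magnitudes are in m/s^2, so a resting
# device reads roughly 9.8 (gravity); values well above that suggest motion.
SHAKE_THRESHOLD = 15.0
SHAKE_MIN_PEAKS = 3  # how many high-magnitude samples count as a deliberate shake

@dataclass
class Notification:
    app: str
    priority: int  # higher = more urgent (hypothetical scale)

def is_shake(magnitudes, threshold=SHAKE_THRESHOLD, min_peaks=SHAKE_MIN_PEAKS):
    """Treat a window of accelerometer magnitudes as a shake gesture
    when enough samples exceed the threshold."""
    return sum(m > threshold for m in magnitudes) >= min_peaks

def batch_dismiss(notifications, magnitudes, keep_priority=2):
    """On a detected shake, dismiss all notifications below the priority
    cutoff; higher-priority notifications are kept."""
    if not is_shake(magnitudes):
        return notifications
    return [n for n in notifications if n.priority >= keep_priority]
```

In a real deployment the thresholding would more likely be replaced by the ML gesture models the publication mentions, but the priority-gated batch dismissal logic would remain the same.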
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.
Carbune, Victor and Feuz, Sandro, "Using Sensors to Improve User Interaction with Application Notifications", Technical Disclosure Commons, (July 10, 2019)