Abstract
Manual logging for health and wellness often results in low user engagement due to the effort and friction involved in data entry. Additionally, continuous activity monitoring with high-fidelity sensors on wearable devices is constrained by finite battery life and computational resources. To address these challenges, a unified architecture is described that facilitates just-in-time adaptive interventions.
A cascaded sensing layer is utilized to reduce power consumption. A low-power sensor continuously screens for motion signatures indicative of potential events, such as hand-to-mouth gestures. Upon detection, a high-power sensor is triggered to capture rich contextual data. This data is processed by a multimodal large language model to create a structured, private record of user activities. A proactive engine then synthesizes the user's history, preferences, and real-time context to deliver personalized advice via audio or a subtle display. This approach enables automated, low-friction wellness management while preserving device battery life and providing timely, actionable support.
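The two-tier gating at the heart of the cascaded sensing layer can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, the acceleration threshold, and the `capture_context` placeholder are all assumptions introduced here for clarity.

```python
# Illustrative sketch of a cascaded sensing layer: an always-on, low-power
# screen gates activation of an expensive high-power contextual capture.
# All names and thresholds below are hypothetical, not from the disclosure.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CascadedSensor:
    # Assumed acceleration-magnitude threshold (in g) for a candidate
    # hand-to-mouth gesture; a real system would use a trained classifier.
    motion_threshold: float = 1.5
    high_power_activations: int = 0

    def screen(self, accel_magnitude: float) -> bool:
        """Cheap, always-on check for motion signatures of potential events."""
        return accel_magnitude >= self.motion_threshold

    def process(self, accel_magnitude: float) -> Optional[str]:
        """Wake the high-power sensor only when the low-power screen fires."""
        if not self.screen(accel_magnitude):
            return None  # high-power sensor stays asleep; battery is preserved
        self.high_power_activations += 1
        return self.capture_context()

    def capture_context(self) -> str:
        # Placeholder for rich capture (e.g. a camera frame) that would be
        # handed to a multimodal LLM for structured, private record creation.
        return "contextual_snapshot"


sensor = CascadedSensor()
# Five accelerometer samples; only two exceed the screening threshold,
# so the high-power path runs twice.
results = [sensor.process(m) for m in [0.2, 0.4, 2.1, 0.3, 1.8]]
```

The design point this sketch captures is that the expensive path never runs unless the cheap screen fires, which is what bounds average power draw in the cascaded architecture.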
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
Recommended Citation
Abadie, Cecilia; Purohit, Aveek; Bavishi, Pinal; Guidroz, Theo; Jain, Ayush; and Mezerreg, Ines, "Just-in-Time Adaptive Interventions for Wellness Management via Context-Aware Wearable Devices", Technical Disclosure Commons, (February 11, 2026)
https://www.tdcommons.org/dpubs_series/9314