Abstract
This document describes techniques for a system that pairs a wearable device, such as a smartwatch or fitness tracker, with an Extended Reality (XR) device, such as an XR headset, Virtual Reality (VR) headset, Augmented Reality (AR) glasses, or Head-Mounted Display (HMD), to improve how the XR device tracks hand movements. XR devices typically rely on cameras and computer vision for tracking, but this optical tracking may decrease in accuracy when the hands are out of view, such as behind a user's back or under a desk. In some examples, the XR device includes a calibration loop that uses optical data to calibrate smartwatch inertial data. Upon visual occlusion (e.g., when the hand is no longer within the field of view of the XR device), the headset may transition to using the calibrated inertial smartwatch data. In some examples, tracking data may additionally be constrained by a biomechanical arm model, which increases accuracy by ensuring that the tracking data is feasible with respect to the biomechanics of the user's arm. By dynamically transitioning between tracking techniques, the XR device may enhance user immersion during extended reality experiences.
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
Recommended Citation
Shevde, Sumukh A. and Chauhan, Sudeep, "OPTICAL AND INERTIAL TRACKING FOR EXTENDED REALITY", Technical Disclosure Commons, ()
https://www.tdcommons.org/dpubs_series/10028