A system and method are disclosed that use data from a sensor-equipped mobile device to guide a human or machine along a vector path through sensory feedback. The system uses motion and depth sensor information and object recognition to create models of an interior space and uses those models for navigation, facilitating movement for a person who has never been inside the space before. The method utilizes interior space maps to identify "safe" vectors. A real-time algorithm compares the user's location and direction of movement with the desired path in the model, producing a measure of the deviation, and then plays an auditory and/or haptic signal that directs the user's attention back toward a safe path. Using real-time object recognition and sensor data allows detection of spatial obstacles that are otherwise difficult to navigate with traditional solutions for aiding the visually impaired.
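The core loop described above (compare the user's position against the desired path, compute a deviation, and map that deviation to a directional cue) can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the function names, the 2D cross-track-error formulation, and the stereo-pan feedback mapping are all assumptions introduced here for clarity.

```python
import math

def cross_track_deviation(pos, seg_start, seg_end):
    """Signed perpendicular distance from the user's position to a
    straight path segment (negative = left of the direction of travel,
    positive = right). Hypothetical helper; 2D points as (x, y) tuples."""
    px, py = pos
    ax, ay = seg_start
    bx, by = seg_end
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    # 2D cross product of (user - start) with the segment direction,
    # normalized by segment length, gives the signed distance.
    return ((px - ax) * dy - (py - ay) * dx) / seg_len

def feedback_signal(deviation, max_dev=1.0):
    """Map a deviation to an auditory/haptic cue: a stereo pan in
    [-1 (left) .. +1 (right)] and an intensity in [0 .. 1].
    Zero deviation produces silence; larger deviation, stronger cue."""
    pan = max(-1.0, min(1.0, deviation / max_dev))
    intensity = abs(pan)
    return pan, intensity

# Example: user standing 1 m left of a path running along the x-axis.
dev = cross_track_deviation((0.5, 1.0), (0.0, 0.0), (2.0, 0.0))
pan, intensity = feedback_signal(dev)
```

In a real system the deviation would be updated continuously from fused motion and depth sensor data, and the pan/intensity pair would drive a spatialized tone or vibration motor rather than being returned as numbers.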
This work is licensed under a Creative Commons Attribution 4.0 License.
Mulford, Brian; Kwa, Ben; Lieske, Jay; and Norris, Wade, "Sensor Based Auditory And Haptic Guidance System", Technical Disclosure Commons, (January 18, 2017)