Abstract
Navigation systems can calculate routes based on metrics such as travel time and distance, but these metrics may not account for the subjective sensory experience of a journey for individuals with sensory sensitivities. Systems and methods are described for sensory-aware routing that can ingest and process various geolocated data streams, including, for example, visual, acoustic, and semantic information. A multimodal inference engine may analyze this data to quantify sensory attributes for geographic path segments, generating a multidimensional sensory vector that can represent characteristics such as auditory intensity and visual complexity. A personalized routing engine could then apply an expanded cost function, combining a user's defined sensitivity profile with other metrics, to calculate travel itineraries. This process can support the generation of routes selected to reduce a user's exposure to potentially distressing stimuli, providing paths that may be more suitable for sensory comfort.
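One possible form of the expanded cost function described above is a per-segment cost that adds a weighted sensory penalty to conventional travel time. The sketch below is a minimal illustration under assumed names: `Segment`, `SensitivityProfile`, `segment_cost`, `route_cost`, and the `alpha` trade-off parameter are hypothetical and do not come from the disclosure, which does not specify an implementation.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of an expanded routing cost function.
# All names and the specific cost formula are illustrative assumptions.

@dataclass
class Segment:
    travel_time: float    # e.g., minutes to traverse this path segment
    sensory: List[float]  # multidimensional sensory vector, e.g.
                          # [auditory intensity, visual complexity]

@dataclass
class SensitivityProfile:
    weights: List[float]  # user-defined weight per sensory dimension

def segment_cost(seg: Segment, profile: SensitivityProfile,
                 alpha: float = 1.0) -> float:
    """Expanded cost: travel time plus weighted sensory exposure."""
    sensory_penalty = sum(w * s for w, s in zip(profile.weights, seg.sensory))
    return seg.travel_time + alpha * sensory_penalty

def route_cost(route: List[Segment], profile: SensitivityProfile) -> float:
    """Total cost of a candidate route under a user's sensitivity profile."""
    return sum(segment_cost(seg, profile) for seg in route)

# Two candidate routes with equal travel times: the quieter route scores
# lower for a user who weights auditory intensity heavily.
quiet = [Segment(10.0, [0.2, 0.3]), Segment(5.0, [0.1, 0.2])]
loud = [Segment(10.0, [0.9, 0.4]), Segment(5.0, [0.8, 0.5])]
profile = SensitivityProfile(weights=[2.0, 0.5])

print(route_cost(quiet, profile))
print(route_cost(loud, profile))
```

A shortest-path search (e.g., Dijkstra or A*) over a graph whose edge weights are computed by such a function would then naturally favor routes that reduce exposure to the stimuli the user has weighted highly.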
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
Recommended Citation
N/A, "Navigation System Using Sensory Profiles and a Personalized Cost Function", Technical Disclosure Commons, (April 16, 2026)
https://www.tdcommons.org/dpubs_series/9814