
ASSIST: Personalized Indoor Navigation via Multimodal Sensors and High-Level Semantic Information

Authors:
Vishnu Nair
Zhigang Zhu
Greg Olmschenk
Manjekar Budhai
William Seiple
Source:
Lecture Notes in Computer Science ISBN: 9783030110239, ECCV Workshops (6)
Publication Year:
2019
Publisher:
Springer International Publishing, 2019.

Abstract

Blind and visually impaired (BVI) individuals and those with Autism Spectrum Disorder (ASD) each face unique challenges in navigating unfamiliar indoor environments. In this paper, we propose an indoor positioning and navigation system that guides a user from point A to point B indoors with high accuracy while augmenting their situational awareness. This system has three major components: location recognition (a hybrid indoor localization app that uses Bluetooth Low Energy beacons and Google Tango to provide high accuracy), object recognition (a body-mounted camera to provide the user momentary situational awareness of objects and people), and semantic recognition (map-based annotations to alert the user of static environmental characteristics). This system also features personalized interfaces built upon the unique experiences that both BVI and ASD individuals have in indoor wayfinding, and it tailors its multimodal feedback to their needs. Here, the technical approach and implementation of this system are discussed, and the results of human subject tests with both BVI and ASD individuals are presented. In addition, we discuss and show the system's user-centric interface and present points for future work and expansion.
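The abstract's hybrid localization component pairs coarse Bluetooth Low Energy beacon signals with Google Tango's finer-grained tracking. As a rough illustration of how a beacon layer alone might yield a coarse position fix, the sketch below converts RSSI readings to distance estimates via the standard log-distance path-loss model and picks the nearest beacon. All function names, constants, and beacon IDs here are hypothetical, not taken from the paper:

```python
import math

def rssi_to_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Estimate distance (meters) to a BLE beacon using the
    log-distance path-loss model. tx_power is the calibrated RSSI
    at 1 m; both defaults are illustrative, not from the paper."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

def nearest_beacon(readings):
    """Given {beacon_id: rssi_dBm}, return the id of the estimated
    closest beacon. In a system like the one described, such a coarse
    fix could seed or correct a finer tracker (e.g., Tango's
    visual-inertial pose estimate)."""
    return min(readings, key=lambda b: rssi_to_distance(readings[b]))

# Hypothetical scan result: stronger (less negative) RSSI => closer.
readings = {"hall_A": -72, "room_101": -58, "stairwell": -85}
print(nearest_beacon(readings))  # prints "room_101"
```

A real deployment would smooth RSSI over time (it is very noisy indoors) and fuse several beacons rather than trusting a single nearest-beacon decision.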

Details

ISBN:
978-3-030-11023-9
ISBNs:
9783030110239
Database:
OpenAIRE
Journal:
Lecture Notes in Computer Science ISBN: 9783030110239, ECCV Workshops (6)
Accession number:
edsair.doi...........c4c97127dc953d5f3f054d2444ec28ca
Full Text:
https://doi.org/10.1007/978-3-030-11024-6_9