Non-Speech Navigation

Study of two non-visual techniques for delivering navigation information 
Publication (ICAD 2007): Speech and Non-Speech Audio: Navigational Information and Cognitive Load, Proceedings of the 13th International Conference on Auditory Display [pdf]

ABSTRACT

Cell phones and other mobile devices let people receive information anywhere, anytime. Navigation information (directions and distance to a destination, interesting nearby locations, etc.) is especially promising. However, there are challenges to delivering information on a cell phone, particularly with a GUI. GUIs are not ideal when a person's visual attention is elsewhere, e.g., scanning for landmarks or assessing safety, and they do not work at all for blind people, who particularly need navigation assistance. Our work responds to this challenge. We investigate two non-visual techniques for delivering navigation information: speech and sonification [3]. We conducted an experiment to compare user performance with, and preference for, the two techniques in both single-task (navigate to a target) and dual-task (navigate to a target while responding to an auditory stimulus) conditions. Users performed better with and preferred sonification in both conditions. We discuss the implications of these results for the design of navigation aids.

Keywords: navigation, sonification, speech, cognitive load, secondary task

Contributions

  • Investigated two non-visual techniques for delivering navigation information: speech and sonification (non-speech audio)
  • Designed and implemented (in Java) an auditory system for navigating in two-dimensional space (see the sketch after this list)
  • Designed and ran experiments, analyzed the results
  • Wrote the paper
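
To make the sonification concrete, here is a minimal, hypothetical Java sketch of one plausible mapping: target bearing drives stereo panning and target distance drives beep rate (closer targets beep faster). It uses only the standard javax.sound.sampled API. The class name, constants, and the specific bearing/distance mapping are illustrative assumptions, not the exact encoding used in the study.

    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.LineUnavailableException;
    import javax.sound.sampled.SourceDataLine;

    /**
     * Sketch of a navigation sonifier: bearing controls left/right
     * panning, distance controls the pause between beeps.
     * All constants below are illustrative assumptions.
     */
    public final class NavigationSonifier {

        private static final float SAMPLE_RATE = 44100f;
        private static final double TONE_HZ = 880.0;  // assumed beep pitch
        private static final int BEEP_MS = 80;        // assumed beep length

        /** Synthesize one stereo beep panned by bearing (-90..+90 degrees). */
        private static byte[] beep(double bearingDeg) {
            // Linear pan: -90 deg = full left, +90 deg = full right.
            double pan = Math.max(-1.0, Math.min(1.0, bearingDeg / 90.0));
            double left = Math.sqrt((1.0 - pan) / 2.0);   // equal-power pan law
            double right = Math.sqrt((1.0 + pan) / 2.0);

            int frames = (int) (SAMPLE_RATE * BEEP_MS / 1000);
            byte[] buf = new byte[frames * 4];            // 16-bit stereo frames
            for (int i = 0; i < frames; i++) {
                double s = Math.sin(2 * Math.PI * TONE_HZ * i / SAMPLE_RATE);
                short l = (short) (s * left * Short.MAX_VALUE * 0.6);
                short r = (short) (s * right * Short.MAX_VALUE * 0.6);
                buf[4 * i]     = (byte) l;                // little-endian L
                buf[4 * i + 1] = (byte) (l >> 8);
                buf[4 * i + 2] = (byte) r;                // little-endian R
                buf[4 * i + 3] = (byte) (r >> 8);
            }
            return buf;
        }

        public static void main(String[] args)
                throws LineUnavailableException, InterruptedException {
            AudioFormat fmt = new AudioFormat(SAMPLE_RATE, 16, 2, true, false);
            try (SourceDataLine line = AudioSystem.getSourceDataLine(fmt)) {
                line.open(fmt);
                line.start();
                // Simulated approach: the target drifts from 60 degrees right
                // to dead ahead while the distance shrinks from 50 m to 5 m.
                for (int step = 0; step < 10; step++) {
                    double bearing = 60.0 * (1.0 - step / 9.0);
                    double distanceM = 50.0 - 5.0 * step;
                    byte[] b = beep(bearing);
                    line.write(b, 0, b.length);
                    line.drain();
                    // Closer targets beep faster: pause scales with distance.
                    Thread.sleep((long) (20 + distanceM * 10));
                }
            }
        }
    }

Running the sketch plays a simulated approach: the beeps drift from the right channel toward the center and speed up as the simulated distance shrinks, which is the kind of continuous, eyes-free feedback the study compared against spoken directions.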