Binaural sonification for navigation aid
In this thesis we present an augmented reality system based on 3D sound and sonification, whose aim is to provide navigation assistance for visually impaired users. The design of this system is addressed in three parts.
First, 3D sound generation via binaural synthesis is limited by the need for individualised head-related transfer functions (HRTFs). A new method based on brain plasticity is established to adapt individuals to HRTFs using an audio-kinaesthetic platform, reversing the standard paradigm. This method has shown the potential for rapid adaptation of the auditory system to virtual auditory cues without the use of vision.
Second, spatial data sonification is investigated in the context of a system for locating and grasping objects in the peripersonal space. Sound localization performance was examined by comparing real and virtual sound sources. On the basis of the results, a distance sonification method is developed with the aim of improving user performance. Rather than employing sonification by sound synthesis, the proposed method varies the parameters of an audio effect applied to a base sound. This allows the user to select and change the base sound without additional learning.
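The principle of sonifying distance through an effect parameter, rather than through sound synthesis, can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the low-pass filter, the distance range, and the cutoff mapping are all assumptions chosen for clarity.

```python
import math

def distance_to_cutoff(distance_m, d_min=0.2, d_max=1.2,
                       f_min=500.0, f_max=8000.0):
    """Map a distance in the peripersonal range to a low-pass cutoff (Hz).

    Nearer objects give a brighter sound (higher cutoff); farther, a duller one.
    The range and mapping here are illustrative assumptions.
    """
    # Clamp distance into the supported range.
    d = min(max(distance_m, d_min), d_max)
    # Normalise to [0, 1]: 0 at d_min (closest), 1 at d_max (farthest).
    t = (d - d_min) / (d_max - d_min)
    # Linear interpolation, inverted so that closeness raises the cutoff.
    return f_max - t * (f_max - f_min)

def apply_lowpass_onepole(samples, cutoff_hz, sample_rate=44100):
    """Apply a one-pole low-pass filter to the samples of any base sound.

    The base sound is interchangeable: only the effect parameter carries
    the distance information.
    """
    # Smoothing coefficient derived from the cutoff frequency.
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out
```

Because the mapping acts on an effect parameter rather than on the sound itself, swapping the base sound changes the timbre but not the meaning, which is what removes the need for relearning.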
Finally, we present the concept of a new sonification method designed to answer end-user needs in terms of aesthetics and sonification customization. "Morphocons" are short audio units used to build a sound vocabulary based on the temporal evolution of sound. An identification test shows that morphocons convey the same information effectively across various types of sounds.
Keywords: augmented reality, 3D sound, sonification, navigation aid, spatial perception, auditory plasticity.