Affiliations: 1. Sektion København, The Faculty of Engineering and Science (ENG), Aalborg University; 2. Aalborg University Copenhagen, The Faculty of Humanities, Aalborg University; 3. Multimodal Interactive Experiences, The Faculty of Engineering and Science (ENG), Aalborg University; 4. Department of Architecture, Design and Media Technology, The Faculty of Engineering and Science (ENG), Aalborg University; 5. The Faculty of Engineering and Science (TECH), Aalborg University; 6. Sound & Music Computing, The Faculty of Engineering and Science (ENG), Aalborg University
In this paper we present an experiment investigating subjects’ ability to match pairs of synthetic auditory and haptic stimuli simulating the sensation of walking on different surfaces. In three non-interactive conditions the audio–haptic stimuli were presented passively through a desktop system, while in three interactive conditions participants produced the audio–haptic feedback themselves while walking. Results show that material typology (i.e., solid or aggregate) is processed very consistently in both the auditory and haptic modalities. Subjects reported a higher level of semantic congruence for audio–haptic pairs of materials belonging to the same typology. Furthermore, matching ability was better in the passive conditions than in the interactive ones, although this may reflect the limits of the technology used for the interactive haptic simulations.
Applied Acoustics, Vol. 75, No. 1, 2014, pp. 59–66