Human-Sound Interaction

Balandino Di Donato, Christopher Dewey, Tychonas Michailidis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


This paper explores the idea of direct and embodied interaction with sound as if it were a tangible and visible object. We define this as Human-Sound Interaction (HSI). A prototype system named SoundSculpt is created to explore HSIs through cross-modal representation of the auditory feedback. Our system uses holographic projection and mid-air haptic feedback to visually represent, and make tangible, morphological and spatial aspects of sounds. Sounds are visualised and presented as a deformable container whose shape is determined by its associated spectral signature. Sounds have interactive possibilities demonstrated by the affordances of the visual projection, the mid-air haptic feedback and the sound itself. With an interaction style analogous to moulding and sculpting, the user is able to boost and attenuate frequency components of the sound by interacting directly with the holographic visualisation of the container projected mid-air using hand gestures.
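The boosting and attenuating of frequency components described above can be sketched as a simple spectral edit. This is an illustrative sketch only, not the SoundSculpt implementation: the paper does not specify how hand gestures map to spectral parameters, so the sketch assumes a gesture has already been resolved into a frequency band and a gain factor, and the function name `sculpt_band` is hypothetical.

```python
import numpy as np

def sculpt_band(signal, sr, band_hz, gain):
    """Boost (gain > 1) or attenuate (gain < 1) one frequency band
    of a mono signal via a forward/inverse FFT round trip."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    lo, hi = band_hz
    mask = (freqs >= lo) & (freqs < hi)
    spectrum[mask] *= gain  # scale only the bins inside the band
    return np.fft.irfft(spectrum, n=len(signal))

# Example: attenuate the 200-800 Hz band of one second of noise by half,
# as a gesture "pressing into" that region of the container might.
sr = 16000
x = np.random.default_rng(0).standard_normal(sr)
y = sculpt_band(x, sr, (200, 800), 0.5)
```

A real-time system would apply such edits per frame (e.g. with an overlap-add STFT) rather than to the whole signal at once.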
Original language: English
Title of host publication: Proceedings of the International Conference on New Interfaces for Musical Expression
Subtitle of host publication: NIME 2020
Number of pages: 3
Publication status: Accepted/In press - 30 Mar 2020
Event: International Conference on New Interfaces for Musical Expression: The Accessibility of Musical Expression - Royal Birmingham Conservatoire, Birmingham, United Kingdom
Duration: 21 Jul 2020 - 25 Jul 2020

Publication series

Name: Proceedings of the Conference on New Interfaces for Musical Expression (NIME)
ISSN (Print): 2220-4806


Conference: International Conference on New Interfaces for Musical Expression
Abbreviated title: NIME 2020
Country: United Kingdom

