Rich Contacts: Corpus-Based Convolution of Audio Contact Gestures for Enhanced Musical Expression

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose ways of enriching the timbral potential of gestural sonic material captured via piezo or contact microphones, through latency-free convolution of the microphone signal with grains from a sound corpus. This creates a new way to combine the sonic richness of large sound corpora, easily accessible via navigation through a timbral descriptor space, with the intuitive gestural interaction with a surface, captured by any contact microphone. We use convolution to excite the grains from the corpus via the microphone input, capturing the contact interaction sounds, which allows articulation of the corpus by hitting, scratching, or strumming a surface with various parts of the hands or objects. We also show how changes of grains have to be carefully handled, how one can smoothly interpolate between neighbouring grains, and finally evaluate the system against previous attempts.
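The core idea described above — exciting grains from a sound corpus with the contact-microphone signal, and smoothly interpolating between neighbouring grains — can be sketched as follows. This is an illustrative sketch in Python with NumPy, not the authors' implementation: the paper relies on latency-free convolution, whereas `np.convolve` here is plain offline convolution, and the function names are hypothetical.

```python
import numpy as np

def convolve_with_grain(mic_block, grain):
    """Excite a corpus grain with a block of contact-microphone input.

    Convolution treats the grain as an impulse response, so percussive
    mic transients (hits, scratches) articulate the grain's timbre.
    """
    return np.convolve(mic_block, grain)

def interpolate_grains(grain_a, grain_b, t):
    """Linearly crossfade between two neighbouring grains, 0 <= t <= 1.

    The shorter grain is zero-padded so both have equal length before
    mixing; this stands in for the careful grain-change handling the
    paper discusses.
    """
    n = max(len(grain_a), len(grain_b))
    a = np.pad(grain_a, (0, n - len(grain_a)))
    b = np.pad(grain_b, (0, n - len(grain_b)))
    return (1.0 - t) * a + t * b

# A unit impulse (an idealised "hit" on the surface) reproduces the
# grain itself, since convolution with an impulse is the identity.
impulse = np.array([1.0, 0.0, 0.0])
grain = np.array([0.5, -0.25, 0.125])
out = convolve_with_grain(impulse, grain)
```

In a real-time setting the convolution would be partitioned and overlap-added per audio block rather than computed offline as here.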
Original language: English
Title of host publication: Proceedings of the International Conference on New Interfaces for Musical Expression
Publisher: Goldsmiths, University of London, UK
Pages: 247-250
Number of pages: 4
Publication status: Published - Jun 2014

Publication series

ISSN (Electronic): 2220-4806


Cite this

Schwarz, D., Tremblay, P. A., & Harker, A. (2014). Rich Contacts: Corpus-Based Convolution of Audio Contact Gestures for Enhanced Musical Expression. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 247-250). Goldsmiths, University of London, UK. http://www.nime.org/proceedings/2014/nime2014_451.pdf