We propose ways of enriching the timbral potential of gestural sonic material captured via piezo or contact microphones, through latency-free convolution of the microphone signal with grains from a sound corpus. This creates a new way to combine the sonic richness of large sound corpora, easily accessible via navigation through a timbral descriptor space, with intuitive gestural interaction on a surface, captured by any contact microphone. We use convolution to excite the grains from the corpus with the microphone input, which captures the contact interaction sounds and allows the corpus to be articulated by hitting, scratching, or strumming a surface with various parts of the hands or with objects. We also show why grain changes must be handled carefully, how one can smoothly interpolate between neighbouring grains, and finally evaluate the system against previous attempts.
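The core idea of the abstract — exciting a corpus grain with the contact-mic signal via convolution, and crossfading between neighbouring grains to interpolate smoothly — can be sketched as follows. This is a minimal offline illustration, not the authors' real-time implementation; the function names, the use of plain `np.convolve`, and the equal-power crossfade law are all assumptions for the sake of the example.

```python
import numpy as np

def convolve_with_grain(mic_block, grain):
    # Excite a corpus grain with a block of contact-microphone samples.
    # (The paper's system does this with zero added latency in real time;
    # plain full convolution here is an offline stand-in.)
    return np.convolve(mic_block, grain)

def crossfade_grains(mic_block, grain_a, grain_b, mix):
    # Smooth interpolation between two neighbouring grains: convolve the
    # input with both grains and crossfade the wet signals.
    # mix = 0.0 -> only grain_a, mix = 1.0 -> only grain_b.
    # Equal-power (cos/sin) crossfade is an assumed choice, not from the paper.
    wet_a = convolve_with_grain(mic_block, grain_a)
    wet_b = convolve_with_grain(mic_block, grain_b)
    theta = mix * np.pi / 2
    return np.cos(theta) * wet_a + np.sin(theta) * wet_b
```

Crossfading the convolved outputs (rather than switching grains abruptly) avoids the clicks that a hard grain change would introduce, which is the motivation behind the careful grain-change handling mentioned in the abstract.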
Title of host publication: Proceedings of the International Conference on New Interfaces for Musical Expression
Publisher: Goldsmiths, University of London, UK
Number of pages: 4
Publication status: Published - Jun 2014
Schwarz, D., Tremblay, P. A., & Harker, A. (2014). Rich Contacts: Corpus-Based Convolution of Audio Contact Gestures for Enhanced Musical Expression. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 247-250). Goldsmiths, University of London, UK. http://www.nime.org/proceedings/2014/nime2014_451.pdf