Abstract
With the widespread use of smartphones that have multiple sensors and sound processing capabilities, there is great potential for increased audience participation in music performances. This paper proposes a framework for participatory mobile music based on mapping arbitrary accelerometer gestures to sound synthesizers. The authors describe Handwaving, a system based on neural networks for real-time gesture recognition and sonification on mobile browsers. Based on a multiuser dataset, results show that training with data from multiple users improves classification accuracy, supporting the use of the proposed algorithm for user-independent gesture recognition. This illustrates the relevance of user-independent training for multiuser settings, especially in participatory music. The system is implemented using web standards, which makes it simple and quick to deploy software on audience devices in live performance settings.
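The page does not include any implementation code; purely as an illustration of the kind of pipeline the abstract describes, the TypeScript sketch below buffers accelerometer frames from the standard `devicemotion` browser event into fixed-length windows and passes each window through a small feed-forward classifier. The window length, gesture labels, and network shape are placeholder assumptions, not the configuration used in Handwaving, and the trained weights are assumed to come from the kind of multi-user (user-independent) training the paper evaluates.

```typescript
// Illustrative browser-side gesture classifier for accelerometer windows.
// All sizes, labels and the network shape are placeholder assumptions.

type GestureLabel = "shake" | "circle" | "chop";      // hypothetical gesture set

const WINDOW_FRAMES = 64;                             // frames per classification window (assumed)
const LABELS: GestureLabel[] = ["shake", "circle", "chop"];

// A tiny fully connected network; weights would be exported after
// user-independent training on a multi-user dataset.
interface Dense { w: number[][]; b: number[] }        // w[out][in]
interface Model { hidden: Dense; output: Dense }

const relu = (x: number): number => Math.max(0, x);

function dense(layer: Dense, input: number[], act: (x: number) => number): number[] {
  return layer.w.map((row, i) =>
    act(row.reduce((sum, wij, j) => sum + wij * input[j], layer.b[i]))
  );
}

function softmax(xs: number[]): number[] {
  const m = Math.max(...xs);
  const exps = xs.map(x => Math.exp(x - m));
  const z = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / z);
}

function classify(model: Model, features: number[]): GestureLabel {
  const hidden = dense(model.hidden, features, relu);
  const probs = softmax(dense(model.output, hidden, x => x));
  let best = 0;
  probs.forEach((p, i) => { if (p > probs[best]) best = i; });
  return LABELS[best];
}

// Buffer x/y/z acceleration into flat windows and classify each full window.
function listen(model: Model, onGesture: (g: GestureLabel) => void): void {
  const buffer: number[] = [];
  window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
    const a = e.accelerationIncludingGravity;
    if (!a) return;
    buffer.push(a.x ?? 0, a.y ?? 0, a.z ?? 0);
    if (buffer.length >= WINDOW_FRAMES * 3) {
      onGesture(classify(model, buffer.slice(0, WINDOW_FRAMES * 3)));
      buffer.length = 0;                              // start the next window
    }
  });
}
```

In the paper the recognised gesture drives sound synthesis in the same page; here the `onGesture` callback is deliberately left open, so it could, for example, trigger a Web Audio oscillator or sample player.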
| Original language | English |
| --- | --- |
| Pages (from-to) | 430-438 |
| Number of pages | 9 |
| Journal | AES: Journal of the Audio Engineering Society |
| Volume | 66 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 18 Jun 2018 |
Profiles

- Gerard Roma
  - Department of History, English, Linguistics and Music - Senior Research Fellow in Interactive Machine Listening
  - School of Music, Humanities and Media
  - Centre for Research in New Music - Member
  - Person: Academic
Projects

- 1 Active
- FluCoMa: Fluid Corpus Manipulations
  Tremblay, P. A., Green, O., Roma, G., Harker, A., Clarke, M. & Dufeu, F.
  1/09/17 → 31/08/22
  Project: Research