Extracting commands from gestures: gesture spotting and recognition for real-time music performance
pp. 72-85
Abstract
Our work enables an interactive music system to spot and recognize "command" gestures from musicians in real time. By interpreting gestures as discrete commands, the system gives the musician gestural control over sound and the flexibility to make distinct changes during a performance. We combine a gesture threshold model with a Dynamic Time Warping (DTW) algorithm for gesture spotting and classification. Two problems are addressed: i) how to recognize discrete commands embedded within continuous gestures, and ii) how to automatically select thresholds and features, using an F-measure criterion, so that the system parameters fit the training data.
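The spotting idea summarized in the abstract — match an incoming gesture window against stored templates with DTW and reject matches whose distance exceeds a learned threshold — can be sketched generically. This is not the authors' implementation; the function names, the single scalar feature per frame, and the fixed `threshold` value are illustrative assumptions:

```python
import math

def dtw_distance(query, template):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(query), len(template)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(query[i - 1] - template[j - 1])
            # Extend the cheapest of the three admissible warping moves.
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def spot_command(window, templates, threshold):
    """Return the best-matching command label, or None if even the best
    DTW distance exceeds the rejection threshold (no command spotted)."""
    best_label, best_dist = None, math.inf
    for label, tpl in templates.items():
        d = dtw_distance(window, tpl)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

templates = {"up": [0, 1, 2, 3], "down": [3, 2, 1, 0]}  # toy templates
print(spot_command([0, 1, 2, 3], templates, 1.0))  # a spotted command
print(spot_command([9, 9, 9, 9], templates, 1.0))  # rejected: no command
```

In the paper's setting the threshold would not be fixed by hand: it (and the feature set) would be chosen to maximize F-measure on labeled training data, trading off false detections against missed commands.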
Publication details
Published in:
Aramaki Mitsuko, Derrien Olivier, Kronland-Martinet Richard, Ystad Sølvi (eds.) (2014) Sound, music, and motion: 10th international symposium, CMMR 2013, Marseille, France, October 15-18, 2013, revised selected papers. Dordrecht: Springer.
Pages: 72-85
DOI: 10.1007/978-3-319-12976-1_5
Full citation:
Tang Jiuqiang, Dannenberg Roger B. (2014) "Extracting commands from gestures: gesture spotting and recognition for real-time music performance", In: M. Aramaki, O. Derrien, R. Kronland-Martinet & S. Ystad (eds.), Sound, music, and motion, Dordrecht: Springer, 72-85.