
Sound Selection By Gestures

Authors :
Caramiaux, Baptiste
Bevilacqua, Frédéric
Schnell, Norbert
Equipe Interactions musicales temps-réel
Sciences et Technologies de la Musique et du Son (STMS)
Institut de Recherche et Coordination Acoustique/Musique (IRCAM) - Université Pierre et Marie Curie - Paris 6 (UPMC) - Centre National de la Recherche Scientifique (CNRS)
Source :
New Interfaces for Musical Expression (NIME), 2011, France. pp. 1-1
Publication Year :
2011
Publisher :
Zenodo, 2011.

Abstract

This paper presents a prototypical tool for sound selection driven by users' gestures. Sound selection by gestures is a particular case of "query by content" in multimedia databases. Gesture-to-sound matching is based on computing the similarity between the temporal evolution of gesture parameters and that of sound parameters. The tool provides three algorithms for matching a gesture query to a sound target. The system opens up several applications in sound design, virtual instrument design, and interactive installation.
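The abstract does not specify the three matching algorithms, but the described idea of comparing the temporal evolution of gesture and sound parameters can be illustrated with a minimal sketch: here, a hypothetical ranking of sounds by dynamic time warping (DTW) distance between 1-D descriptor curves. The function names and the use of DTW are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def dtw_distance(query, target):
    """DTW distance between two 1-D parameter curves (illustrative, not
    the paper's algorithm). Lower means more similar temporal evolution."""
    n, m = len(query), len(target)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(query[i - 1] - target[j - 1])
            # Accumulate the cheapest warping path to cell (i, j).
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def rank_sounds(gesture_curve, sound_descriptors):
    """Rank database sounds by similarity of their descriptor curve
    to the gesture query; returns (name, distance) pairs, best first."""
    scored = [(name, dtw_distance(gesture_curve, curve))
              for name, curve in sound_descriptors.items()]
    return sorted(scored, key=lambda pair: pair[1])

# Hypothetical example: a rising-falling gesture best matches sound "a".
gesture = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
sounds = {"a": [0.0, 1.0, 2.0, 3.0, 2.0, 1.0],
          "b": [3.0, 3.0, 3.0, 3.0]}
ranking = rank_sounds(gesture, sounds)
```

DTW is a common choice for this kind of temporal matching because it tolerates differences in timing between the gesture and the sound descriptor; a real system would operate on multidimensional feature sequences rather than 1-D curves.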

Details

Database :
OpenAIRE
Journal :
New Interfaces for Musical Expression (NIME), 2011, France. pp. 1-1
Accession number :
edsair.doi.dedup.....05dae2d1fbd3ecd379ad3d54494ab633
Full Text :
https://doi.org/10.5281/zenodo.1177976