Gesture analysis and modelling for real-time interaction with digital media
15th November 2011, 2:00pm
Free and open to the public.
PCRI Bât 650, Salle 445 (near the entrance)
The now-ubiquitous use of multitouch interfaces and inertial measurement sensors has put forward new uses of gesture in Human-Computer Interaction. In particular, there is great interest in continuous gestures as interaction input, as often proposed in recent interaction paradigms (for example, the two-finger pinch to resize images). In this context, gesture recognition techniques could be used to further enhance interaction possibilities. In this talk, I will present our recent work in gesture analysis and modelling in the context of musical performance and artistic installation. I will first review common approaches to gesture control of sound for musical expression. Then I will present the methods developed at Ircam – Centre Pompidou for continuous gesture tracking (and recognition) and their use in complex situations (such as concerts or public installations), as well as the use of gesture invariants (e.g. rotation, scale) for expressive sound control. In this context, I will examine the need to take into account the temporal structure of gesture at different scales. Finally, I will extend our approach to higher-level gesture features, namely movement qualities, that are suitable for new interaction paradigms in artistic applications. Specifically, I will present and discuss various interfaces designed for dance, with a particular focus on the notion of "movement qualities", which can be defined as the manner in which a movement is executed.
Baptiste Caramiaux is a PhD candidate at Ircam – Centre Pompidou and University Pierre et Marie Curie (Paris VI). His research mainly focuses on the analysis and modelling of gestures for interacting with sound. More precisely, his interests lie in: the design of experiments for analysing how people bodily represent a sound using gestures, aiming to link the mental and gestural representations of sound; the modelling of gesture using Bayesian models; and the development of tools for real-time gesture analysis and control of sound (and other digital media). He received an M.Sc. in Acoustics, Signal Processing and Computer Science Applied to Music, awarded jointly by Ircam, Univ. Paris VI and Telecom ParisTech, in 2008.