Sylvie Gibet

Professor in Computer Science
University of Bretagne Sud
IRISA, Expression team
Campus de Tohannic, Vannes, France

Tel: (+33)297017243
E-Mail: sylvie.gibet @
Office Hours: Email only


Teaching

  • Functional programming in OCaml (Licence 1)
  • Algorithms and data structures in Python (Licence 2)
  • Introduction to data processing and machine learning (Master 1)
  • Movement and artificial intelligence (Master 2)
  • Interactive multimedia projects (Master 1-2)

Research in Gesture

My current research focuses on gesture as a language for expressing, feeling, and interacting with systems in virtual worlds. My goal is to move towards expressive representations of gesture that help users better control and feel their interactions with the systems they manipulate. This means considering both the sensory-motor cues involved in motor performance and the linguistic components that can be combined to give meaning to movement. Beyond the general public, who are increasingly interested in motion-based interfaces, the main users are experts performing trained gestures, engineers and multidisciplinary scientists, as well as digital artists. My fields of application concern gestures characterized by a high level of semantics and expressiveness, such as Sign Language gestures and gestures in the performing arts. Several research axes are considered:
  • Signing Avatars: text-to-sign and sign-to-text translation in French Sign Language
  • Human Motion Synthesis: inversion and optimization models, concatenative synthesis, physical models
  • Classification, Annotation, Recognition using machine learning methods
  • Computer Music: percussive, pianistic, conducting gestures, musical instruments
  • Performing Arts: emotional theatrical movements, haptic control
Keywords: Gesture, Expressiveness, Analysis, Recognition, Synthesis, Sign Language, Musical Gesture, Performing Arts

Main Projects

Expressive Gesture

  • INVARIANT (2020-2022): Space-time invariants in gesture performances
  • EXPRESSIVE (2018-2019): Taxonomy of fundamental descriptors of expressive gesture

Sign Language Gesture: Modeling, Translation, and Synthesis

  • Video2LSF (2019-...): From videos to sign language recognition
  • Text2Sign (2016-2020): Mocap-driven animation of sign language avatars from phonological modeling
  • Region Bretagne - FEel (2014-2020): Facial expression analysis and synthesis
  • ANR SIGN3D (2012-2016): Editing Sign Language Gestures from High-Fidelity MoCap Data
  • ANR SIGNCOM (2008-2011): Communication in French Sign Language between Users and Avatars
  • CFQCU MARQSPAT (2009-2011): Spatial Markers in Three Sign Languages
  • Region Bretagne - SIGNE (2004-2008): Virtual signer in French Sign Language

Gesture for Performing Arts

  • MappEMG (2020-2022): Electromyographic control of multi-sensory processes
  • ELEMENTS (2019-2021): Gestural control of natural sounds
  • INGREDIBLE (2012-2016): Modeling theatrical gestures
  • PERCUSSIVE (2012-2016): Learning percussive gestures
  • ICCASS (2006-2010): Control and animation of a virtual percussionist
  • COST-287 (2003-2007): Gesture control of digital audio systems
  • HuGEx (2003-2007): Humanoid endowed with expressive Gestures