WhoLoDance: Whole-Body Interaction Learning for Dance Education


European Commission
H2020-ICT-2015-single-stage (Topic: Technologies for better human learning and teaching, ICT-20-2015) Research and Innovation Action


Project team

Professor Sarah Whatley, Dr Rosamaria Cisneros, Dr Ruth Gibson, Dr Karen Wood


Project partners

Lynkeus SRL – Co-ordinator (Italy), Athena RC (Greece), Motek Entertainment (Netherlands), Politecnico di Milano (Italy), Università di Genova (Italy), Peachnote GmbH (Germany), INSTITUTO STOCOS (Spain), K. DANSE (France), Lyceum Club of Greek Women (Greece)

Duration of project

01/10/2016 - 31/12/2018


Project overview

By applying multimodal sensing and capture analysis, WhoLoDance will use advanced motion capture technologies alongside EMG, bio-sensors, video, audio and accelerometers to transfer dance movements into digital data, in such a way that any specific motion element can be blended with any other motion element within the motion capture database.

This will allow WhoLoDance to deliver varied combinations of the dance moves contained in a teaching syllabus, together with their multimodal rendering, based on life-size holograms or other volumetric projection display methods, as well as on touch feedback, spatial audio, and abstract visualisation that engages the dancer's peripheral vision and "sense of self". By applying high-level feature (HLF) analysis, sequence-similarity and live-indexing clustering methods, WhoLoDance will also develop and train multimodal feature-extraction algorithms and related applications (search, real-time feedback, classification, automated annotation, etc.), leading to the automatic identification and comparison of dance patterns and styles.
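As a rough illustration, "blending" two captured motion elements can be thought of as interpolating between corresponding poses. The sketch below linearly interpolates the (x, y, z) joint positions of two poses; the pose format, joint values and the `blend_poses` function are illustrative assumptions, not the project's actual data model or API.

```python
# Minimal sketch of motion blending: linear interpolation between two
# captured poses, each represented as a list of (x, y, z) joint positions.
# (Hypothetical data model; real mocap pipelines typically interpolate
# joint rotations, not raw positions.)

def blend_poses(pose_a, pose_b, weight):
    """Return a pose that is (1 - weight) * pose_a + weight * pose_b."""
    if len(pose_a) != len(pose_b):
        raise ValueError("poses must have the same number of joints")
    return [
        tuple(a + weight * (b - a) for a, b in zip(joint_a, joint_b))
        for joint_a, joint_b in zip(pose_a, pose_b)
    ]

# Two single-joint example poses: an arm joint at rest and raised.
rest = [(0.0, 1.0, 0.0)]
raised = [(0.0, 2.0, 0.5)]

halfway = blend_poses(rest, raised, 0.5)
print(halfway)  # [(0.0, 1.5, 0.25)]
```

Sweeping `weight` from 0 to 1 over successive frames would produce a smooth transition from one movement into the other, which is the intuition behind assembling new sequences from a library of captured motion elements.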

Dancer applying sensors

Project objectives

WhoLoDancE aims to develop and apply breakthrough technological tools that will assist dance teachers, students, choreographers, professional dancers and researchers in their desktop and dance studio work, stimulating their innovative thinking and creativity. The main objectives of the project are summarised below:

  1. Develop a large library of dance movements based on data acquired through motion capture (mocap) sessions and annotated in a manner that allows data interpolation, extrapolation and synthesis, making it possible to preserve cultural heritage and, in the long term, creatively enrich it.
  2. Develop a ‘blending engine’: a powerful tool that will allow choreographers and dance teachers to blend and assemble an infinite number of dance motions from the library of movements, stimulating the development of novel choreographic methods.
  3. Automate the analysis of expressivity and movement qualities in nonverbal dance data by applying similarity search tools and techniques for expressivity analysis, aiming to facilitate the investigation of movement principles and vocabularies, mental imagery and simulation connected to dance practices, and to stimulate the development of new research domains.
  4. Develop life-size volumetric displays (avatars) of dance masters’ motions that will enable dancers to assess their own body alignment and technique by comparison, stimulating the development of novel teaching and learning methods.
  5. Provide access to the developed library of movements through commercially available, consumer-grade motion capture devices such as the Microsoft Kinect and Intel RealSense, making it easily accessible to a wide audience.
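The similarity search mentioned in objective 3 has to compare movements performed at different speeds. A standard technique for that is dynamic time warping (DTW), sketched below on two 1-D feature sequences (e.g. a joint angle sampled over time); the feature choice and local cost are illustrative assumptions, not the project's documented method.

```python
# Hedged sketch of sequence similarity via dynamic time warping (DTW).
# Each sequence is a list of 1-D feature values, e.g. a joint angle per
# frame; the local cost is the absolute difference between samples.

def dtw_distance(seq_a, seq_b):
    """Classic O(n*m) DTW alignment cost between two sequences."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    # cost[i][j] = best cost of aligning seq_a[:i] with seq_b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame of seq_a
                                 cost[i][j - 1],      # skip a frame of seq_b
                                 cost[i - 1][j - 1])  # match both frames
    return cost[n][m]

# The same rise-and-fall gesture performed slowly and quickly:
slow = [0.0, 0.5, 1.0, 1.0, 0.5, 0.0]
fast = [0.0, 1.0, 0.0]
print(dtw_distance(slow, fast))
```

Because DTW stretches and compresses the time axis while aligning the two sequences, two performances of the same movement at different tempi score as far more similar than a plain frame-by-frame comparison would suggest.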