WhoLoDancE: Whole-Body Interaction Learning for Dance Education
H2020-ICT-2015-single-stage (Topic: Technologies for better human learning and teaching, ICT-20-2015) Research and Innovation Action
Lynkeus SRL - Co-ordinator (Italy), Athena RC (Greece), Motek Entertainment (Netherlands), Politecnico di Milano (Italy), Università di Genova (Italy), Peachnote GmbH (Germany), INSTITUTO STOCOS (Spain), K. DANSE (France), Lyceum Club of Greek Women (Greece)
Duration of project
01/10/2016 - 31/12/2018
By applying Multimodal Sensing and Capturing Analysis, WhoLoDancE will make use of advanced motion capture technologies, together with EMG sensors, bio-sensors, video, audio and accelerometers, to transfer dance movements into digital data in such a way that any specific motion element can be blended with any other motion element within the motion capture database.
This will allow WhoLoDancE to deliver varied combinations of the dance moves contained in a teaching syllabus, together with their multimodal rendering based on life-size holograms or other volumetric projection display methods, as well as on touch feedback, spatial audio, and abstract visualisation that focuses on the dancer’s peripheral vision and “sense of self”. By applying high-level feature (HLF) analysis, sequence similarity and live-indexing clustering methods, WhoLoDancE will also make it possible to develop and train multimodal feature extraction algorithms and related applications (search, real-time feedback, classification, automated annotation, etc.), leading to the automatic identification and comparison of dance patterns and styles.
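The blending of motion elements described above can be pictured, in its simplest form, as frame-by-frame interpolation between two captured poses. The Python sketch below is purely illustrative and is not the project’s actual engine: the pose representation (a mapping from joint names to angles) and the function names are assumptions for the example, and a real blending engine would interpolate joint rotations more carefully (e.g. quaternion slerp).

```python
def blend_poses(pose_a, pose_b, weight):
    """Linearly interpolate two poses given as {joint_name: angle_in_degrees}.

    weight = 0.0 returns pose_a, weight = 1.0 returns pose_b.
    Illustrative only: real mocap blending interpolates rotations, not raw angles.
    """
    if pose_a.keys() != pose_b.keys():
        raise ValueError("poses must share the same skeleton joints")
    return {joint: (1.0 - weight) * pose_a[joint] + weight * pose_b[joint]
            for joint in pose_a}


def blend_sequences(seq_a, seq_b, weight):
    """Blend two equal-length motion clips (lists of poses) frame by frame."""
    return [blend_poses(a, b, weight) for a, b in zip(seq_a, seq_b)]
```

For example, blending a pose with `knee` at 0 degrees and one with `knee` at 90 degrees at `weight=0.5` yields a pose with `knee` at 45 degrees.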
WhoLoDancE aims to develop and apply breakthrough technological tools that will assist dance teachers, students, choreographers, professional dancers and researchers in their desktop and dance studio work, stimulating their innovative thinking and creativity. The main objectives of the project are summarised below:
- Develop a large library of dance movements based on data acquired through motion capture (mocap) sessions and annotated in a manner that allows data interpolations, extrapolations and synthesis, making it possible to preserve cultural heritage, and in the long-term creatively enrich it.
- Develop a ‘blending engine’: a powerful tool that will allow choreographers and dance teachers to blend and assemble an infinite number of dance motions from the library of movements, stimulating the development of novel choreographic methods.
- Automate the analysis of expressivity and movement qualities in non-verbal dance data by applying similarity search tools and techniques for expressivity analysis, aiming to facilitate the investigation of movement principles and vocabularies, mental imagery and simulation connected to dance practices, and to stimulate the development of new research domains.
- Develop life-size volumetric displays (avatars) of dance masters’ motions that will enable dancers to self-assess their body alignment and technique by comparison, stimulating the development of novel teaching and learning methods.
- Provide access to the developed library of movements through commercially available, consumer-grade motion capture devices such as the Microsoft Kinect and Intel RealSense, making it easily accessible to a wide audience.
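The similarity-search objective above depends on comparing motion sequences that may unfold at different speeds. A standard technique for such comparisons is dynamic time warping (DTW); the following Python sketch applies it to two one-dimensional feature sequences. It is a minimal illustration under stated assumptions, not the project’s documented method: real dance data would use multidimensional per-frame pose features rather than single numbers.

```python
import math

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two 1-D feature sequences.

    Unlike a frame-by-frame comparison, DTW allows one sequence to be
    locally stretched or compressed in time to best match the other.
    """
    n, m = len(seq_a), len(seq_b)
    # cost[i][j] = DTW distance between seq_a[:i] and seq_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame of seq_a
                                 cost[i][j - 1],      # skip a frame of seq_b
                                 cost[i - 1][j - 1])  # match both frames
    return cost[n][m]
```

Because DTW tolerates timing differences, a movement performed slowly scores as similar to the same movement performed quickly: `dtw_distance([1.0, 2.0, 3.0], [1.0, 2.0, 2.0, 3.0])` is 0, whereas a naive frame-aligned comparison would not be.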
The WhoLoDancE project developed and applied breakthrough technologies to dance learning in order to achieve results with relevant impacts on numerous targets including, but not limited to, dance practitioners ranging from researchers and professionals to dance students and the interested public. By applying similarity search tools, computational models, emotional content analysis and techniques for the automated analysis of non-verbal expressive movement to dance data, the project helped investigate movement and learning principles, vocabularies, mental imagery and simulation connected to dance practices. It also created a proof-of-concept motion capture repository of dance motions, allowing interpolations, extrapolations and synthesis through similarity search among different compositions, documenting diverse and specialised dance movement practices and learning approaches.
A prototype life-size volumetric display will enable a dance student literally to step inside the dance expert’s body, using immersive and responsive motion capture data to identify and respond to collisions between the physical and virtual bodies. The project also aims to develop choreographic methods by building and structuring an interactive repository of motion capture dance libraries. A dance data blending engine will give choreographers and dance teachers a powerful tool to blend and assemble an infinite number of dance compositions.
To see the project videos, visit the WhoLoDancE Vimeo channel.
Publications, reports, films, policy documents and dance performances
Cisneros, R. K., Stamp, K., Whatley, S. and Wood, K. (2019) ‘WhoLoDancE: digital tools and the dance learning environment’, Research in Dance Education, vol. 20, no. 1, pp. 54-72.
Cisneros, R. K., Wood, K., Whatley, S., Buccoli, M., Zanoni, M. and Sarti, A. (2019) ‘Virtual Reality and Choreographic Practice: The Potential for New Creative Methods’, Body, Space, Technology, vol. 18, no. 1.
Rizzo, A., El Raheb, K., Katifori, A., Ioannidis, Y., Cisneros, R., Buccoli, M., Zanoni, M., Markatzi, A., Camurri, A., Piana, S., Viro, V. and Whatley, S. (2018) ‘WhoLoDancE: Whole-body Interaction Learning for Dance Education’, EuroMed, published proceedings.
Camurri, A., Whatley, S. and El Raheb, K. (2018) ‘A Conceptual Framework for Creating and Analysing Dance Learning Digital Content’, MOCO, published proceedings.