
E2-Create - Encoding Embodied Creativity for Computational Art
Funder
Horizon 2020 - Marie Skłodowska-Curie grant agreement No 840465
Collaborators
C-DaRE, Coventry University (lead)
Secondment partners are:
Motion Bank, Hochschule Mainz, Germany
Instituto Stocos, Spain
Project overview
Dance represents a rich resource of bodily expertise that is exciting and challenging for other scientific and artistic domains to draw from. E2-Create addresses this challenge by providing generative approaches to facilitate the exchange between dance and computer-based art. E2-Create places a strong focus on the combination of software development and artistic creation informed by recent progress in dance digitisation, machine learning (ML), and generative art.
Project Objectives
E2-Create has four main objectives:
Gain an understanding of principles of embodied creativity.
Evaluate existing ML models and computer simulations for their suitability in dance.
Develop new ML models and computer simulations for dance creation.
Disseminate project results among artists, scientists, students, and the general public.
Impact statement
Impacts on Society
E2-Create highlights how tightly dance, as embodied creativity, and digital technology practices can be intertwined to achieve a high level of mutual exchange between the two for research, development, and creation. This has the potential to increase public awareness of embodied forms of knowledge and of their importance for the research and development of digital technology.
Impacts on Art
E2-Create makes its main impact on the artistic fields of Dance and Technology, Creative Coding, and Generative Art.
Practitioners in Dance and Technology employed software and sensors to translate their expressivity into music and light and to choreograph and rehearse with artificial dancers.
Creative Coders employed ML models to develop generative musical instruments and interactive systems that detect or generate dance movements.
Practitioners in Generative Art were provided with generative systems that illustrate how bodily creativity can be abstracted and how ML techniques and traditional generative methods can be combined.
Impacts on Science
E2-Create makes its main impact on the academic fields of Movement and Computing, Human-Computer Interaction, and Computational Creativity.
Scientists in Movement and Computing were provided with motion capture recordings of professional dancers, with procedures for deriving higher level movement qualities, and with generative methods for simulating these qualities.
Scientists in Human-Computer Interaction were provided with methods for establishing intuitive forms of interaction with ML models and with sensors that exploit minute body movements as an interaction modality.
Scientists in Computational Creativity were provided with an ML model that paves the way for future research on how ML benefits from the creativity employed by dancers.
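The project's published procedures for deriving higher-level movement qualities (e.g. Laban Effort factors) are considerably more elaborate, but the basic idea can be illustrated with a minimal sketch. The descriptor below, a per-frame "quantity of motion" summed over joint speeds, and the assumed data layout (frames as lists of (x, y, z) joint positions) are illustrative assumptions, not the project's actual code:

```python
from math import sqrt

def quantity_of_motion(frames, dt):
    """Per-frame 'quantity of motion': the summed speeds of all joints
    between consecutive motion-capture frames.

    frames: a sequence of frames, each a list of (x, y, z) joint
    positions in metres. dt: time step between frames in seconds.
    Returns one value per pair of consecutive frames.
    """
    qom = []
    for prev, curr in zip(frames, frames[1:]):
        speed_sum = 0.0
        for (x0, y0, z0), (x1, y1, z1) in zip(prev, curr):
            # Euclidean displacement of this joint, divided by the
            # time step, gives its instantaneous speed in m/s.
            speed_sum += sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) / dt
        qom.append(speed_sum)
    return qom

# Two joints, each moving 0.01 m per frame at 100 fps (1 m/s each):
frames = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
          [(0.01, 0.0, 0.0), (1.01, 0.0, 0.0)]]
qom = quantity_of_motion(frames, dt=0.01)  # approximately [2.0]
```

Descriptors of this kind can then be smoothed over time and mapped to sound, light, or generative-system parameters.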
Outputs
Publications
Bisig, D., Wegner, E., and Kimmig, H. “Strings P”. Proceedings of the 9th Conference on Computation, Communication, Aesthetics & X, July 12-16, Graz, Austria, 2021 https://www.researchgate.net/publication/353447184_Strings_P
Bisig, D., “Granular Dance”. Proceedings of the 9th Conference on Computation, Communication, Aesthetics & X, July 12-16, Graz, Austria, 2021 https://www.researchgate.net/publication/353447100_Granular_Dance
Bisig, D. and Tatar, K., “Raw Music from Free Movements: Early Experiments in Using Machine Learning to Create Raw Audio from Dance Movements”. Proceedings of the 2nd Conference on AI Music Creativity, July 18-22, Graz, Austria, 2021 https://www.researchgate.net/publication/353447404_Raw_Music_from_Free_Movements_Early_Experiments_in_Using_Machine_Learning_to_Create_Raw_Audio_from_Dance_Movements
Bisig, D. and Wegner, E., “Puppeteering an AI - Interactive Control of a Machine-Learning based Artificial Dancer”. Proceedings of the XXIII conference on Generative Art, December 15-17, Cagliari, Italy, 2021 https://www.researchgate.net/publication/356788649_Puppeteering_an_AI_-_Interactive_Control_of_a_Machine-Learning_based_Artificial_Dancer
Bisig, D., “Expressive Aliens - Laban Effort Factors for Non-Anthropomorphic Morphologies”. Proceedings of the International Conference on Computational Intelligence in Music, Sound, Art and Design (Part of EvoStar), April 20-22, Madrid, Spain, 2022, pp. 36-51. https://www.researchgate.net/publication/359969230_Expressive_Aliens_-_Laban_Effort_Factors_for_Non-anthropomorphic_Morphologies
Bisig, D. and Wegner, E., “Puppeteering AI - Interactive Control of an Artificial Dancer”. Proceedings of the Generative AI and HCI - CHI 2022 Workshop, May 10, New Orleans, USA, 2022 https://www.researchgate.net/publication/360950859_Puppeteering_AI_-Interactive_Control_of_an_Artificial_Dancer
Bisig, D., “Generative Dance - a Taxonomy and Survey”. Proceedings of the 8th International Conference on Movement and Computing, June 22-24, Chicago, USA, 2022 https://www.researchgate.net/publication/361550895_Generative_Dance_-_a_Taxonomy_and_Survey
Bisig, D. and Romero, M., “Translating Idiosyncratic Movement Qualities”. Proceedings of the 11th EAI International Conference: ArtsIT, Interactivity & Game Creation, November 21-22, Faro, Portugal, 2022
Bisig, D., “Practical Resources for Developing Idiosyncratic Generative Systems for Dance”. Proceedings of the XXV Generative Art Conference, December 12-14, Rome, Italy, 2022 https://www.researchgate.net/publication/365791632_Practical_Resources_for_Developing_Idiosyncratic_Generative_Systems_for_Dance
Performances
Strings P concert: https://youtu.be/eUwZuc2OxHs
Artificial Intimacy: http://artificial-intimacy.dance/
Embodied Machine: https://embodied-machine.motionbank.org
Websites/Blogs
Project Blog: https://wp.coventry.domains/e2create/
Project Educational Material: https://wp.coventry.domains/e2edu/
Documentation Creative Process "Artificial Intimacy": http://artificial-intimacy.dance/
Documentation Creative Process "Embodied Machine": https://embodied-machine.motionbank.org
Interviews with dancers from Staatstheater Mainz: http://www.sdela.dds.nl/motionbank/cristel_zach_amber_interviews/#/
Datasets
Dance Data and Interviews with dancers from Staatstheater Mainz: https://zenodo.org/record/7031672
Dance Data and Movement Quality Descriptions from Choreographer/Dancer Muriel Romero: https://zenodo.org/record/7034917
Software
Qualisys to OSC Converter: https://github.coventry.ac.uk/ad5041/QTM_to_OSC
Motion Capture Player: https://github.coventry.ac.uk/ad5041/MocapPlayer
Motion Capture Analysis Software: https://github.coventry.ac.uk/ad5041/MocapAnalysis
Motion Capture Analysis Library: https://github.coventry.ac.uk/ad5041/ofxDabDataProc
Video Tracking Library: https://github.coventry.ac.uk/ad5041/ofxDabVideoTracker
Granular Dance: https://github.coventry.ac.uk/ad5041/GranularDance
Puppeteering AI: https://github.coventry.ac.uk/ad5041/PuppeteeringAI
RAMFEM: https://github.coventry.ac.uk/ad5041/RawMusicFromFreeMovements and https://zenodo.org/record/4656086
Expressive Aliens: https://zenodo.org/record/5604902
Strings: https://github.coventry.ac.uk/ad5041/ofxDabSpring
MOQUAM: https://github.coventry.ac.uk/ad5041/ExpressiveAliensSimOnly
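Several of the tools above (e.g. the Qualisys to OSC Converter) stream motion-capture data between applications over Open Sound Control. As an illustration of the wire format only, and not the converter's actual code, the sketch below encodes a single marker position as an OSC message following the OSC 1.0 specification, using only the Python standard library; the address pattern and marker name are hypothetical:

```python
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments.

    Per the OSC 1.0 specification, strings are NUL-terminated and
    zero-padded to 4-byte boundaries, and floats are big-endian
    IEEE 754 single precision.
    """
    def pad(b: bytes) -> bytes:
        # Pad to the next 4-byte boundary, always appending at
        # least one NUL terminator.
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad(address.encode("ascii"))
    # Type tag string: a comma followed by one 'f' per float argument.
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# One marker position (x, y, z) in metres, as it might be streamed
# per frame to a sound or graphics application:
packet = osc_message("/mocap/marker/head", 0.12, 1.58, -0.33)
```

In practice the resulting bytes would be sent as a UDP datagram to the receiving application; libraries such as python-osc wrap this encoding behind a client API.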