Co-creation Services for Shared (and Intangible) Cultural Engagements
Remote Performance Platform for Collaborative Music-making
​
The participants involved in the development of the Remote Performance Platform will co-create music together at a distance. Research, technological development, and collaborative music-making will happen more or less simultaneously in the process. Since the platform is being developed for remote co-creation and online experiences, most participants can engage in the research process from their homes.
​
The research performed in developing the Remote Performance Platform also explores the interactive possibilities of the technology used. Research participants will be able to explore not only the sound of music but also the feeling of it. By including the communication of thoughts and emotions between participants, we aim to develop the remote co-creative artistic process.
​
The Technology
​
The JackTrip technology, developed at CCRMA, Stanford University, is one part of this platform. JackTrip gives musicians collaborating online very low latency, enabling them to stay synchronized and minimizing the problem of delayed sound. Since the sound passes through JackTrip, we can also experiment with the digital room and its resonance. We will also explore haptics and biodata: by combining these technologies with neurophysiological analysis, we can investigate how feelings and states of mind can be transferred remotely from one person to another. Sensors will collect data from signals of the nervous and muscular systems as well as from respiration.
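To make the biodata idea more concrete, below is a minimal sketch, written in Python with only the standard library, of how a participant's node might stream timestamped sensor readings (for example muscle activity and respiration) to a remote peer over UDP alongside the JackTrip audio link. The peer address, packet layout, and the read_sensors() stand-in are illustrative assumptions, not the platform's actual protocol.

```python
import random
import socket
import struct
import time

# Hypothetical peer address: in practice this would be the remote
# participant's machine; 127.0.0.1 keeps the sketch runnable locally.
PEER_ADDR = ("127.0.0.1", 9000)
SAMPLE_RATE_HZ = 25  # biodata is low-rate compared with the audio stream

def read_sensors():
    """Stand-in for real EMG/respiration hardware; returns two floats."""
    return random.random(), random.random()

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seq = 0
    try:
        while True:
            emg, respiration = read_sensors()
            # Sequence number and send time let the receiver align the biodata
            # with the low-latency audio carried separately by JackTrip.
            packet = struct.pack("!Id2d", seq, time.time(), emg, respiration)
            sock.sendto(packet, PEER_ADDR)
            seq += 1
            time.sleep(1.0 / SAMPLE_RATE_HZ)
    except KeyboardInterrupt:
        sock.close()

if __name__ == "__main__":
    main()
```

On the receiving side, a matching struct.unpack("!Id2d", ...) call would recover the values; the timestamps then allow the biodata to be aligned with the audio and mapped, for example, to haptic feedback or to the resonance of the shared digital room.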
​
Disability as a Crucial Experience for Development
​
MuseIT aims to co-design, co-develop, and co-evaluate all of its developments. In WP5 we carry out many participatory co-design and co-creation activities, and we have engaged a group of participants who will help the researchers develop the platform. The uniqueness of each individual's experience of disability is crucial, as this platform aims to be accessible and to offer co-creative cultural experiences regardless of ability. The Remote Performance Platform is focused on music-making; however, the technology used can be applied to other forms of art. The use of multimodal technologies will open up enhanced experiences for everyone, an important step towards equal access to arts and culture.
​
Our goal is to develop low-latency technologies that facilitate co-creation services for creating born-digital content. Music is an essential part of cultural heritage and one of the most challenging to transmit multimodally. Music can consist of tangible cultural heritage such as notation, instruments, and recordings. However, a music performance, and most notably traditional songs passed on through generations such as the Sami yoik, is intangible heritage. This is why music is an integral part of MuseIT. If we are able to create multimodal experiences of music, we will be able to do the same with other art forms. Our user-centred and co-creative approach allows us to explore how it feels to engage in creating art as well as to experience it.
This recorded improvisation captures a unique interaction between musician Ewe and the AI-driven software Somax2, developed by IRCAM in Paris. The session showcases the capabilities of Somax2, as it dynamically analyses and generates music in real time, allowing Ewe to create an evolving soundscape. The AI system, controlled by IRCAM researcher Mikhail Malt, facilitates remote musical collaboration by responding to Ewe's input and shaping the musical flow with its own generative responses. The recording offers a glimpse into how artificial intelligence can inspire new forms of creative expression, pushing the boundaries of traditional music-making.