Bringing the magic of playing music to the virtual world
The Joint Active Music Sessions (JAMS) platform has the potential to develop into a social network where musicians can learn, connect, and perform.
BIRMINGHAM, WEST MIDLANDS, UNITED KINGDOM, January 2, 2025 /EINPresswire.com/ -- Researchers are aiming to bring the magic of playing music in person to the virtual world.
The Joint Active Music Sessions (JAMS) platform, created at the University of Birmingham, uses avatars that individual musicians create and share with fellow musicians to stage virtual concerts, run practice sessions, or enhance music teaching.
Dr Massimiliano (Max) Di Luca from the University of Birmingham explains: “A musician records themselves and sends the video to another musician. The software creates a responsive avatar that plays in perfect synchrony with the music partner. All you need is an iPhone and a VR headset to bring musicians together for performance, practice, or teaching.”
The JAMS platform has the potential to develop into a social network like Spotify or Myspace, where musicians can interact to learn, connect, perform, develop new music, and create virtual concerts that reach larger audiences.
JAMS has the distinct flavour of a platform developed with and for musicians, whether established performers or early-stage learners.
The avatars capture the unspoken moments that are key in musical performance, allowing practice partners or performers to watch the tip of the violinist’s bow, or make eye contact at critical points in the piece. They also adapt in real time, responding dynamically to the musician wearing the VR headset and delivering a unique, personalised experience.
Delivery by VR headset recreates the musician’s world and provides an immersive backdrop with a realistic rendering of other musicians and cues used in the real-life setting. It also keeps the faces at eye level, which adds to the feeling of connectedness.
Critically, there is no ‘latency’ in the JAMS user experience. Dr Di Luca explains: “Latency is the delay between when a sound is produced and when it reaches the listener. Performers can start to feel its effects at delays as low as 10 milliseconds, throwing them ‘off-beat’, breaking their concentration, or distracting them from the technical aspects of playing.”
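To put that figure in perspective, a quick back-of-the-envelope calculation shows what 10 milliseconds means in musical terms. The short Python sketch below is purely illustrative and is not part of the JAMS software; the tempo and speed-of-sound values are standard constants, not project parameters.

```python
# Back-of-the-envelope illustration of why ~10 ms of latency matters to
# ensemble musicians. Illustrative only, not JAMS code: the constants are
# standard physical and musical figures, not project parameters.

SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at roughly 20 degrees C


def latency_as_distance(latency_ms: float) -> float:
    """Distance sound travels during the given latency, in metres."""
    return SPEED_OF_SOUND_M_PER_S * latency_ms / 1000.0


def latency_as_beat_fraction(latency_ms: float, bpm: float) -> float:
    """Fraction of one beat that the latency consumes at a given tempo."""
    beat_ms = 60_000.0 / bpm
    return latency_ms / beat_ms


if __name__ == "__main__":
    for ms in (10, 30, 50):
        print(
            f"{ms:>3} ms ~ standing {latency_as_distance(ms):.1f} m apart, "
            f"or {latency_as_beat_fraction(ms, 120):.1%} of a beat at 120 bpm"
        )
```

At 120 beats per minute a beat lasts 500 milliseconds, so a 10-millisecond delay already shifts a player by 2% of the beat, roughly the same offset as standing 3.4 metres from a bandmate.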
JAMS is underpinned by an algorithm created during the Augmented Reality Music Ensemble (ARME) project, which captures the dynamic timing adjustments performers make to stay together. The project brought together researchers from six disciplines (psychology, computer science, engineering, music, sport science, and maths), whose input realised the vision of building a computational model that reproduces, with precision, a musician’s body movements and delivers an avatar that meets the needs of co-performers.
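The release does not spell out the algorithm itself, but the behaviour it describes, each player nudging their timing in response to the others, is commonly formalised in the sensorimotor-synchronisation literature as a linear phase-correction model. The sketch below is a minimal illustration of that standard model, not the ARME implementation; the correction gain, tempo, and noise level are assumed values chosen for the demonstration.

```python
import random

# Minimal sketch of a linear phase-correction model, a standard account of
# how two performers keep time together. The ARME algorithm itself is not
# published in this release, so the values below are illustrative
# assumptions, not project parameters.

BEAT_MS = 500.0        # nominal inter-onset interval (120 bpm)
ALPHA = 0.25           # assumed phase-correction gain (0 = no correction)
TIMING_NOISE_MS = 5.0  # assumed motor/timekeeper noise (std dev)


def simulate_duet(n_beats: int = 16, seed: int = 1) -> list[float]:
    """Return the follower's asynchrony (ms) relative to a steady leader."""
    rng = random.Random(seed)
    leader_t = 0.0
    follower_t = 20.0  # follower starts 20 ms late
    asynchronies = []
    for _ in range(n_beats):
        async_ms = follower_t - leader_t
        asynchronies.append(async_ms)
        # Each beat, the follower shortens or lengthens its next interval
        # by a fraction ALPHA of the last asynchrony, plus timing noise.
        follower_t += BEAT_MS - ALPHA * async_ms + rng.gauss(0.0, TIMING_NOISE_MS)
        leader_t += BEAT_MS
    return asynchronies


if __name__ == "__main__":
    for i, a in enumerate(simulate_duet()):
        print(f"beat {i:2d}: asynchrony {a:+6.1f} ms")
```

Run as written, the follower's initial 20-millisecond lag shrinks geometrically beat by beat, which is the qualitative behaviour any model of mutual timing adjustment between co-performers has to reproduce.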
Dr Di Luca adds: “We’re aiming to bring the magic of playing music in person to the virtual world. You can adapt the avatar that other people play with, or learn to play better through practice with a maestro.”
JAMS allows musicians to perform in an interactive virtual group, and can be adapted for lip-syncing or dubbing in media. It can also gather unique user data to create digital twins of musicians, offering licensing opportunities for a range of applications and further exploitation of catalogues and publishing rights.
Commercial enquiries should be directed to the ARME project website at: https://arme-project.co.uk/contact/
Ruth C Ashton
University of Birmingham Enterprise
r.c.ashton@bham.ac.uk