AI helps create soundtrack for breathtaking Mercury flyby

An extraordinary new film showing the planet Mercury appearing from the shadows has been soundtracked with the help of state-of-the-art artificial intelligence (AI) tools developed at the University of Sheffield.

A video of BepiColombo's third Mercury flyby

The film shows breathtaking footage captured by the European Space Agency (ESA)/Japan Aerospace Exploration Agency (JAXA) BepiColombo spacecraft during its third flyby of Mercury, part of Europe's first mission to the closest planet to the Sun.

The ESA commissioned the acclaimed artist to compose a fitting soundtrack for the flybys, with the latest one offering a rare glimpse of Mercury's night side.

Anil Sebastian composed the music for the remarkable sequence with the assistance of the Artificial Musical Intelligence (AMI) tool developed by the University's Machine Intelligence for Musical Audio (MIMA) research group.

Dr Ning Ma, from the Department of Computer Science, explains how the technology works: "AMI uses AI to discover patterns in musical structures such as melodies, chords and rhythm from tens of thousands of musical compositions.

"It encodes music data in a way that is similar to reading a music score, enabling the technology to better capture musical structures. The learning of musical structures is enhanced by adding phrasing embeddings - expressive nuances and rhythmic patterns captured numerically - at different time scales. As a result, AMI is able to generate compositions for various musical instruments and different musical styles with a coherent structure."
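As a rough, unofficial illustration of the multi-scale idea Dr Ma describes, the sketch below adds bar-level and phrase-level embeddings to score-like token embeddings. The class, parameter names and sizes are assumptions made for this example and are not taken from AMI itself.

```python
# Illustrative sketch only: score-like tokens (notes, chords, rests) are
# embedded, and embeddings at two time scales (bar and phrase) are added
# so a model can pick up longer-range musical structure.
import torch
import torch.nn as nn

class MusicEncoder(nn.Module):
    def __init__(self, vocab_size=512, n_bars=256, n_phrases=64, d_model=256):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)    # notes, chords, rests
        self.bar_emb = nn.Embedding(n_bars, d_model)          # shorter time scale: bar index
        self.phrase_emb = nn.Embedding(n_phrases, d_model)    # longer time scale: phrase index

    def forward(self, tokens, bar_ids, phrase_ids):
        # Summing the embeddings lets each token carry both its local (bar)
        # and longer-range (phrase) context - the "different time scales"
        # mentioned above.
        return self.token_emb(tokens) + self.bar_emb(bar_ids) + self.phrase_emb(phrase_ids)

# A toy two-bar fragment of eight tokens, all within the first phrase.
tokens = torch.randint(0, 512, (1, 8))
bar_ids = torch.tensor([[0, 0, 0, 0, 1, 1, 1, 1]])
phrase_ids = torch.zeros(1, 8, dtype=torch.long)
features = MusicEncoder()(tokens, bar_ids, phrase_ids)     # shape: (1, 8, 256)
```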

Sebastian, a MIMA collaborator and Creative Director of sonic branding agency Maison Mercury Jones, composed the music for the first two Mercury flyby movies with artist Ingmar Kamalagharan. The two compositions formed the basis of the third.

"We wanted to know what would happen if we fed AMI the ingredients from the first two compositions," said Sebastian.

"This gave us the seeds for the new composition which we then carefully selected, edited and weaved together with new elements.

"Using this technology almost acts as a metaphor for the Mercury mission; there's a sense of excitement but there's also trepidation, much like with AI."

An image showing part of the region of Mercury covered by the ESA flyover sequence, reconstructed as a 3D anaglyph

For Sebastian, the early adoption of AI in their music follows a career of breaking boundaries.

"I've had a lifelong fascination with technology and consciousness, particularly what makes us human," Sebastian continued.

"So the idea of AI in general and the possibilities it raises have fascinated me ever since I watched the film Metropolis as a child.

"I've always been into music technology and in some ways generative composition, like with AMI, feels like an almost natural progression.

"For me there are two strands to it - one is the questioning of our usefulness as human creators, and this can create a lot of fear among artists. There's also another fear at play - the fear of missing out or being left behind.

"I'm finding that the more you work with this type of technology, the more it can teach you. In that sense I'm interested in what it can teach us about ourselves as human creators. I think it's one of those things that once we see what it can do, it changes things forever.

"It's a little bit like when humans first saw the waveform. As soon as you see that representation of an audio wave, you can't unsee it and it changes how you hear and experience music. I think we're at the dawn of that with AI as well."

From a user perspective, AMI is an assistive compositional tool. The user interface allows composers and creators to upload a MIDI file to provide a starting point for a new composition. AMI will then generate new musical material based on this musical seed. Composers can adjust musical parameters such as instrumentation, musical mode and metre, through to more abstract qualities such as adventurousness.
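As a loose sketch of that workflow, not AMI's actual interface, the example below loads a seed MIDI file and passes it, together with the kinds of parameters mentioned above, to a hypothetical generate_variation function. The file names, parameter names and the function itself are illustrative stand-ins.

```python
# Hypothetical seed-and-parameters workflow; generate_variation is a
# placeholder for the generative step, not a real AMI call.
import mido

def generate_variation(seed, instrumentation, mode, metre, adventurousness):
    """Stand-in for the generation step; here it simply echoes the seed."""
    return seed

# The composer's MIDI file provides the starting point for a new composition.
seed = mido.MidiFile("mercury_flyby_seed.mid")

new_material = generate_variation(
    seed,
    instrumentation=["strings", "synth pad"],   # which instruments to write for
    mode="aeolian",                             # musical mode
    metre=(7, 8),                               # time signature
    adventurousness=0.7,                        # how far to stray from the seed material
)
new_material.save("ami_variation.mid")          # bring the result back into the composer's workflow
```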

"A lot of the time people's reaction initially is: 'Isn't using AI cutting corners and making life easy for yourself?'" Sebastian added.

"In reality it takes longer. It takes more time because one of the things that you come up against is sort of choice paralysis, because you have dozens, if not hundreds more options for every single juncture. A lot of composition is about making choices. With AMI you make every decision, but with a wider range of possibilities at your disposal."

Professor Guy Brown, Head of the University's Department of Computer Science, said: "We really value the collaborative research that we are doing with Anil Sebastian, and it's so gratifying to see our AI tools being used to make such beautiful music. Our goal is very much to work with musicians to develop AI systems that support and extend their creative endeavours, rather than replace them at the touch of a button. Anil's music for the BepiColombo mission is a perfect example of human composer and AI system working in perfect harmony."

Professor Nicola Dibben, from the University's Department of Music, said: "AI is about to have a huge impact on music making and dissemination. We're proud to be working with Anil Sebastian to understand the implications of AI music generation and to create a future for Music AI together which is fair and inclusive."
