

It is tempting to think of immersive audio as a done deal within the industry. Great strides have been made in audio reproduction in recent years, with consumer deployments of 5.1 and Dolby Atmos becoming increasingly mass market and no longer simply the preserve of the AV enthusiast. However, as this fascinating Accelerator focusing on Immersive Audio and Sound Imagery proves, there is still a lot to be achieved. New object-oriented audio technologies not only raise the bar for high-end audio reproduction, but also hold out the potential to vastly improve a range of areas of more pragmatic interest to broadcasters. “We see the delivery of object audio not in order to fly it around the living room, but so viewers can make their own mixes and be able to choose from a list of languages,” comments Karl Petermichl, technical director at Austrian broadcaster ORF, one of the co-Champions on the project. The project has a broad range of Champions from industry and academia working together to produce three binaural POCs that will showcase some of the new possibilities opened up by immersive audio, from live music performance through 360° video to spoken-word dramatisations.

Initially pitched at the IBC Accelerators Kickstart event by independent media consultant and project lead Benjamin Schwartz, his DIY soundscape experiment ‘Being There’ helped inspire the wider scope of this challenge. “I was stranded at my mother’s house in March 2020 and deeply disappointed to miss a concert due to lockdown,” he explains. “I was looking for a solution to address the millions stranded like me in a house with just a regular HD TV and stereo sound. I wanted a solution that would make a difference for everyone. Knowing that the sound mix in music videos is permanently fixed during a performance, even if there are varied camera angles, I started working on the ‘Being There’ project: breaking that unsaid rule of a static audio mix.” To further explore where cutting-edge audio innovation can be applied to this basic concept, one area of the POC will focus on a fully immersive music experience, showcasing six degrees of freedom, room spatialisation and 360VR elements. The idea is that dynamic soundscapes will accompany the camera and, thanks to input from project Participants Omnilive (video) and MagicBeans (audio), the viewer will be able to select the camera angle and even move around within the 360-degree space. As they do this the audio perspective will match their movement, thanks in part to the pre-release 6DOF volumetric audio feature of MagicBeans’ forthcoming RoundHead software. The key software brought into the Accelerator with the
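The camera-following audio perspective described above can be sketched in a few lines: as the listener moves through the 360-degree space, the renderer recomputes each source’s distance gain and relative azimuth. This is a toy model under assumed conventions (2D positions, simple inverse-distance attenuation); the pre-release RoundHead 6DOF feature is proprietary and its internals are not public.

```python
import math

def follow_listener(source_pos, listener_pos, listener_yaw_deg):
    """Toy 6DOF audio perspective: given a listener's position and
    facing direction, return a source's distance gain and the azimuth
    at which it should be rendered. All names are illustrative."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dy)
    gain = 1.0 / max(dist, 1.0)                 # inverse-distance attenuation
    azimuth = math.degrees(math.atan2(dy, dx)) - listener_yaw_deg
    azimuth = (azimuth + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return gain, azimuth
```

Recomputing these two values per source, per frame, is what lets the soundscape track the viewer as they move with the camera.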

express aim of achieving the spatial mix comes from a spin-out of co-Champion King’s College London, which has developed a subset of technologies that it hopes will underpin the next generation of audio. In addition to its application within the music R&D of this



Immersive Audio and Sound Imagery
Champions: CTOI, ORF, BBC, Audible, Twickenham Film Studios, King’s College London, The Audio Engineering Society (AES), University of Surrey, University of Lethbridge, MuseumTV
Participants: Omnilive, MagicBeans

Accelerator, it will also be implemented in the POC’s spoken-word use case. This will focus on a multi-narrative audio play, specially commissioned by fellow Champion Audible and recorded at Twickenham Film Studios, another co-Champion of this challenge. The spin-out’s co-founder is Zoran Cvetkovic, professor of signal processing at King’s, and he lists two main challenges in the field of broadcast immersive audio. The first, how to provide a stable auditory perspective for a listener moving around the listening environment, is beyond the scope of this Accelerator; hence the decision to present the POCs binaurally. The second is very much being tackled, though, and relates to techniques for capturing room acoustics.

“When you observe an acoustic event in an enclosed space, what you hear is the initial wave, then the first-order reflections, the second-order reflections and so on off the walls and objects,” explains Cvetkovic. “It is via the processing of the relationships between these different reflections that the human auditory system creates a perception of where the objects in the space are. The more of these reflections we render the better, but doing that with proper accuracy in real time is computationally prohibitive, and moving objects make it more difficult still. However, we have developed technology that allows for synthesising most of it in real time using modest computational resources.”
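The image-source method is one standard way to model the early reflections Cvetkovic describes: each wall of a rectangular room contributes a mirror image of the source, and the image-to-listener distances give the arrival delay of each reflection. A minimal 2D sketch, assuming a shoebox room; it illustrates the general idea only and is not the KCL spin-out’s actual algorithm.

```python
import math

def first_order_images(src, room):
    """Mirror a source (x, y) across each wall of a rectangular room
    [0, Lx] x [0, Ly]; one image source per first-order reflection."""
    x, y = src
    lx, ly = room
    return [(-x, y),            # reflection off the x = 0 wall
            (2 * lx - x, y),    # reflection off the x = Lx wall
            (x, -y),            # reflection off the y = 0 wall
            (x, 2 * ly - y)]    # reflection off the y = Ly wall

def reflection_delays(src, listener, room, c=343.0):
    """Arrival delay (seconds) of each first-order reflection, from the
    image-to-listener path length at the speed of sound c (m/s)."""
    return [math.dist(img, listener) / c
            for img in first_order_images(src, room)]
```

Higher-order reflections multiply the image count rapidly, which is exactly the computational blow-up the quote refers to.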


The team behind the technology originally developed it with classical music in mind before realising its potential in immersive environments. And it is the prospect of taking that potential, and object-oriented audio in general, and applying it to the broadcast environment that has ORF’s Petermichl excited. “The way it automatically adapts to the actual listening

environment is very important for us,” he says. “In the era of smartphones and big TVs, you never know where your programme is being listened to: on the underground with earphones in, or on a 100in smart TV in a big living room in a cinema-like experience. Synthesising and adapting a given configuration to the actual point of listening is a really important task for us.”

The project will also hopefully address four key issues for broadcasters: the perennial problem of variation in loudness and viewers not being able to hear the centre channel; improving language versatility; providing interactivity for narrative purposes; and delivering stems to provide a more immersive mix. “We see our work as a stepping-stone to immersive technologies, which might require specific equipment at the customer end,” he concludes. “We don’t know if our current iteration of the technology will just play a proof-of-concept role to advance the immersive agenda or be used for several years, maybe even commercially. That wasn’t our initial purpose. We just want to rediscover that wonderful sense of Being There, when the audience understands the artistic intent of a live performer.”
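The stems idea above amounts to deferring the final mix to the receiver: objects arrive as separate stems, and the viewer’s device sums them, selecting one dialogue language and optionally boosting dialogue to tackle the centre-channel audibility problem. A hypothetical sketch; the stem names and dictionary layout are illustrative and not part of any broadcast specification.

```python
def render_mix(stems, language="en", dialogue_gain_db=0.0):
    """Viewer-side object-audio mix: sum the music and effects stems,
    then add the chosen dialogue language at a user-set gain.
    The stem layout here is a hypothetical example."""
    gain = 10 ** (dialogue_gain_db / 20)   # dB -> linear gain
    return [m + e + gain * d
            for m, e, d in zip(stems["music"], stems["effects"],
                               stems["dialogue"][language])]
```

A viewer struggling to hear dialogue could, for example, call `render_mix(stems, "en", dialogue_gain_db=6.0)` to lift it roughly 6 dB above the default balance, something a pre-baked stereo mix cannot offer.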

For more information on the IBC Accelerator Media Innovation Programme, supported by Nvidia, visit ibc-accelerator-media-innovation-programme
