
THIS YEAR’S TECHNICAL PAPERS


IBC is world-renowned for the quality, timeliness and innovative subject matter of its Technical Papers. They provide an opportunity for technologists and companies to unveil their ideas and research to media industry leaders hungry for new technology concepts, their possible uses and practical applications. The Technical Papers Programme received entries from across the media, entertainment and technology sector, spanning subjects such as facial recognition, networked homes, AI in media and the cloud for live production. This year, all of the accepted papers are available for download via IBC365, while the presentations, each comprising multiple papers and listed below, are available on IBC Digital.


FACIAL RECOGNITION: VARIOUS FACETS OF A POWERFUL MEDIA TOOL


Facial recognition is one of today’s most controversial media technologies. It is now claimed that it can reliably recognise subjects wearing sunglasses or medical masks, and that it can even differentiate between identical twins. Not all of its many possibilities are sinister, however. In this session, viewers are shown how facial processing is already employed by a global broadcaster, where it is demonstrating its value in several areas of live broadcast production, such as helping a commentator to recognise each of 500 runners in a road race. A second presentation explores how a convolutional neural network has been trained to classify facial expressions, allowing it to recognise and tag emotion in video material. This provides an entirely new way of categorising and searching drama and movie content.
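
As a purely illustrative aside, the emotion-tagging idea can be sketched in a few lines of Python. This is not the paper's system: the model file, the seven-class label set and the frame-sampling interval below are all assumptions, and the classifier is a generic ResNet-18 taken to have been fine-tuned on facial-expression data.

    # Hypothetical sketch: tagging emotion in video frames with a CNN classifier.
    # Assumes a ResNet-18 fine-tuned for seven expression classes and saved locally;
    # the model file, label set and sampling interval are illustrative, not the paper's.
    import cv2
    import torch
    import torchvision.transforms as T
    from torchvision.models import resnet18

    EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutral", "sadness", "surprise"]

    model = resnet18(num_classes=len(EMOTIONS))
    model.load_state_dict(torch.load("emotion_cnn.pt", map_location="cpu"))
    model.eval()

    preprocess = T.Compose([T.ToPILImage(), T.Resize((224, 224)), T.ToTensor()])
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def tag_emotions(video_path, every_n_frames=25):
        """Yield (frame_index, emotion) for each face found in sampled frames."""
        capture = cv2.VideoCapture(video_path)
        index = 0
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            if index % every_n_frames == 0:
                grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                for (x, y, w, h) in face_detector.detectMultiScale(grey, 1.3, 5):
                    face = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
                    with torch.no_grad():
                        logits = model(preprocess(face).unsqueeze(0))
                    yield index, EMOTIONS[int(logits.argmax())]
            index += 1
        capture.release()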


CUTTING EDGE TECHNOLOGIES: A PREVIEW OF SOME EXPERIMENTAL CONCEPTS


This session examines three thought-provoking new ideas, each a potential media game-changer. First is the robot companion that watches TV with family and friends, sharing the fun and the emotional engagement. Implemented as an autonomous, free-standing character, the robot attends to the screen and enjoys discussing the content.


The next, an ongoing experimental development, concerns a live performance in which musicians in a concert hall play together with itinerant musicians walking through the streets of the city. We chart the considerable technical difficulties involved in conveying several low-latency audio and 4K video signals through 5G networks, together with the associated production and artistic challenges. The third initiative is a novel means by which an individual’s entire media streaming history, from all their service providers, can be used to provide personal recommendations without any contributing service needing to hold or process their data.
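
As an aside, the privacy property of that third idea is easy to picture with a toy example. The sketch below is not the paper's method; it merely shows recommendations computed entirely on the viewer's own device, assuming each service can export a watch-history file and that a public catalogue of item feature vectors exists. All names, file formats and parameters here are illustrative assumptions.

    # Hypothetical sketch: recommendations computed locally from a combined watch history.
    # Nothing leaves the device; each service only supplies an exported history file.
    # File format, catalogue structure and feature vectors are illustrative assumptions.
    import json
    import numpy as np

    def load_history(paths):
        """Merge watch histories exported from several streaming services."""
        watched = set()
        for path in paths:
            with open(path) as handle:
                watched.update(json.load(handle))   # each export: a list of item ids
        return watched

    def recommend(watched, catalogue, top_n=10):
        """Rank unwatched catalogue items against a locally computed taste profile."""
        ids = list(catalogue)
        vectors = np.array([catalogue[item_id] for item_id in ids], dtype=float)
        vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)

        seen_rows = [row for row, item_id in enumerate(ids) if item_id in watched]
        taste = vectors[seen_rows].mean(axis=0)     # average of watched item vectors
        score_by_id = dict(zip(ids, vectors @ taste))

        unwatched = [item_id for item_id in ids if item_id not in watched]
        return sorted(unwatched, key=lambda item_id: -score_by_id[item_id])[:top_n]

    # e.g. recommend(load_history(["service_a.json", "service_b.json"]), catalogue)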


AI IN MEDIA PRODUCTION: CREATING NEW MARKETS FOR LINEAR CONTENT


For decades, broadcasters have been producing linear programmes, such as news, magazines or documentaries, which contain valuable audio-visual information about a vast variety of individual topics. The problem is that these individual topics are often neither addressable nor findable. Could AI and machine learning segment or chapterise this archived material so it would be reusable in the interactive digital world? Might AI even be able to re-edit it into personalised media? We look at a fascinating project which is doing all this and more. Improvement is still required, especially in the editorial challenges for AI of creating recompiled media, but public-facing trials are underway and generating much interest. Also key to the reuse of these repurposed assets is recognising the diversity of today’s video delivery platforms, in particular social media. AI can be used to target particular content offerings cleverly across platforms according to predictions of audience interests, trending stories, particular localities, anniversaries and so on, all achieved through news scanning and online trend monitoring. Content can also be automatically adapted to suit the style and culture of each platform. Visit IBC Digital to hear from an ambitious European project which is seeking to craft and distribute video optimally across a diversity of channels.
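
As a purely illustrative aside, the chapterising step can be sketched very simply: split a programme's transcript wherever the vocabulary of adjacent windows of sentences stops overlapping. This is not the project's method, and the window size and threshold below are arbitrary assumptions.

    # Hypothetical sketch of "chapterising" a linear programme from its transcript:
    # cut wherever the topical similarity between adjacent windows of sentences drops.
    # Window size and threshold are illustrative, not values from the project.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def chapterise(sentences, window=5, threshold=0.1):
        """Return lists of sentences, one list per detected chapter."""
        tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences).toarray()

        boundaries = [0]
        for i in range(window, len(sentences) - window):
            before = tfidf[i - window:i].mean(axis=0, keepdims=True)
            after = tfidf[i:i + window].mean(axis=0, keepdims=True)
            # A sharp drop in lexical similarity suggests the topic has changed.
            drop = cosine_similarity(before, after)[0, 0] < threshold
            if drop and i - boundaries[-1] >= window:
                boundaries.append(i)
        boundaries.append(len(sentences))

        return [sentences[a:b] for a, b in zip(boundaries, boundaries[1:])]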


ADVANCES IN AUDIO: USING SOME REMARKABLE SIGNAL PROCESSING


Every broadcaster knows that the most common complaint from viewers is that programme dialogue is hard to discern against a background of atmospheric sounds, mood music and competing voices. The problem grows with age: 90% of people over 60 report difficulties. In this session, a collaboration of researchers presents the results of trials of their deep-neural-network-based technology across a wide range of TV content and age groups. Also presented is exciting research using cloud-based AI and 5G connectivity to deliver live immersive experiences to a variety of consumer devices. Key to the experience is the viewers’ ability to change their viewpoint on the content, with live rendering taking place in the cloud.
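
As a crude illustration of the goal, and emphatically not the researchers' deep-neural-network approach, the sketch below simply boosts the centre (mid) channel of a stereo mix, where dialogue usually sits, relative to the side content. The file names and the 4 dB figure are illustrative.

    # Not the DNN method described above: a classic mid/side stand-in that lifts
    # dialogue (usually mixed to the centre) relative to ambience and music.
    # File names and the 4 dB default are illustrative assumptions.
    import numpy as np
    import soundfile as sf

    def boost_dialogue(in_path, out_path, mid_gain_db=4.0):
        audio, rate = sf.read(in_path)            # expects a stereo file, shape (n, 2)
        left, right = audio[:, 0], audio[:, 1]
        mid = 0.5 * (left + right)                # dialogue-heavy centre content
        side = 0.5 * (left - right)               # stereo width: ambience, music
        mid *= 10 ** (mid_gain_db / 20.0)         # apply the boost in decibels
        out = np.stack([mid + side, mid - side], axis=1)
        sf.write(out_path, np.clip(out, -1.0, 1.0), rate)

    boost_dialogue("programme_stereo.wav", "programme_dialogue_boost.wav")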


ORCHESTRATED DEVICES: A VISION OF NETWORKED HOME ENTERTAINMENT


The number of devices in the modern home that are capable of reproducing media content is already considerable. However, a user will typically employ only one when enjoying a particular form of entertainment. Suppose all the devices in the home were connected and synchronised through an internet-of-things-type network; the home could then come alive, with the same entertainment engaging multiple elements: screens, speakers, mobile phones, shaking sofas, even domestic lighting and smart appliances. This session shows how a prototype audio orchestration tool has been designed and trialled on several productions to evaluate the principle of creative orchestration, with extremely positive results. In a second presentation, intelligence within the home automatically orchestrates the incoming media across the available devices, taking account of the content, the environment and the wishes of the user.
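
As an aside, the core mechanic of orchestration, getting every device to act on the same timeline, can be sketched in miniature. This is not the prototype tool described above: it assumes the devices share an NTP-synchronised clock, and the addresses, port and two-second lead time are invented for illustration.

    # Hypothetical sketch: a coordinator cues every home device to start the same
    # item at the same wall-clock moment, keeping screens, speakers and lights in step.
    # Assumes NTP-synchronised clocks; addresses, port and lead time are illustrative.
    import json
    import socket
    import time

    DEVICES = [("192.168.1.20", 9000), ("192.168.1.21", 9000)]   # illustrative addresses

    def orchestrate(content_id, lead_seconds=2.0):
        """Broadcast a synchronised start cue to every device on the home network."""
        cue = {"content": content_id, "start_at": time.time() + lead_seconds}
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for address in DEVICES:
            sock.sendto(json.dumps(cue).encode(), address)
        sock.close()

    def device_loop(port=9000):
        """Run on each device: wait for a cue, sleep until the agreed moment, then play."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", port))
        while True:
            cue = json.loads(sock.recv(4096).decode())
            time.sleep(max(0.0, cue["start_at"] - time.time()))
            print("playing", cue["content"])      # stand-in for starting playback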




THE CLOUD: FOR LIVE AND PRODUCTION WORKFLOWS


Cloud-based production is revolutionising the working practices of journalists and entertainment media producers, allowing increased flexibility of location and

