HOW TO BUILD TRUST IN THE AGE OF AI
BBC R&D’s Antonia Kerle speaks to John Maxwell Hobbs about the battle against disinformation and how the broadcaster is responding to emerging tech
Antonia Kerle, Chief Technical Advisor at BBC R&D, took part in the IBC Conference panel ‘Fighting Disinformation and Disengagement: Staying relevant in the digital age’, alongside representatives from Euronews, News UK, Al Jazeera and PA Media Group on Saturday. The panellists explored how broadcasters and platforms can maintain trust and relevance as audiences turn away from traditional news sources and disinformation flourishes online. As generative AI increasingly intersects with the widespread distribution of disinformation, few public service broadcasters are better placed to address the issue of trust than the BBC. Kerle is leading the organisation’s exploration of how emerging technologies interact with trust, truth and its public service mission. Her work cuts across strategy, policy, research and innovation, with a single driving question: how can the BBC adapt, lead and protect its audiences in a shifting digital environment?
At BBC R&D, Kerle leads a team that helps the broadcaster understand emerging technologies, from AI and quantum computing to immersive media and blockchain. “More importantly,” she says, the focus is on “what the BBC can do strategically to respond.” Her role blends research with influence, aiming to shape both internal decision-making and wider industry debate.
SHAPING THE FUTURE
Kerle’s team works across the BBC to shape how the organisation understands and prepares for technological change. Rather than focusing on product development, their remit is to offer strategic foresight and policy direction. “We work really closely with policy, strategy and our product groups,” she explains. “Our job is to provide strategic clarity on questions like, ‘What should we invest in?’, ‘What needs standards?’ and ‘Where is the risk to our public service mission?’” The team’s insights help to identify opportunities, assess risks and prioritise areas for innovation. Kerle sees their role as enabling the BBC to act with intention rather than reacting piecemeal to disruption.
This isn’t without challenges. Requests from across the organisation often exceed what limited R&D resources allow. “Sometimes you are a funnel for people’s hopes and dreams,” she says. Part of her role is helping to set boundaries and focus effort where it will have the most impact. The team’s ultimate goal is to guide the BBC’s technological evolution in a way that is strategic, coherent and rooted in purpose.
TACKLING DISINFORMATION
Disinformation is not a new problem, but generative AI has turned it into an industrialised process. As generative models improve, the barriers to creating convincing fake content vanish. Kerle’s team has played a leading role in shaping the BBC’s response to these challenges. A major area of focus is content provenance.
“My team was part of the founding group around C2PA [the Coalition for Content Provenance and Authenticity],” she says, referencing the global standard backed by the BBC, Microsoft, Adobe and others. In 2023, the BBC ran trials with BBC Verify to test how visible content credentials – metadata about the origin and editing history of media – affected audience trust. “We asked, ‘Did this help you trust this BBC content more?’ In the vast majority of cases, that had a really positive impact,” she says.
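The core idea behind content credentials is binding provenance metadata to the exact bytes of a media asset, so any alteration after signing is detectable. The sketch below is a deliberately simplified illustration of that principle only: real C2PA manifests are cryptographically signed CBOR/JUMBF structures embedded in the file, and the field names here (`producer`, `actions`) are hypothetical stand-ins, not the C2PA schema.

```python
# Illustrative sketch only: shows the hash-binding idea behind content
# credentials, not the actual C2PA manifest format or signing process.
import hashlib


def make_credential(asset_bytes: bytes, claims: dict) -> dict:
    """Bind a SHA-256 hash of the asset to its provenance claims."""
    return {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "claims": claims,  # e.g. who produced it, what edits were made
    }


def verify_credential(asset_bytes: bytes, credential: dict) -> bool:
    """Check that the asset still matches the hash in its credential."""
    return hashlib.sha256(asset_bytes).hexdigest() == credential["asset_sha256"]


image = b"...raw image bytes..."
cred = make_credential(image, {
    "producer": "BBC News",          # hypothetical claim fields
    "actions": ["captured", "cropped"],
})
print(verify_credential(image, cred))          # True: bytes unchanged
print(verify_credential(image + b"x", cred))   # False: altered after binding
```

In the real standard, the manifest itself is signed by the publisher, so a verifier can trust both that the bytes are unmodified and that the claims genuinely came from, say, the BBC.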
“We must work with communities, not just to say, ‘This is true, this is false,’ but to give them the tools to understand the online environment more broadly”
Another front is deepfake detection. The team is developing tools to support BBC journalists in evaluating suspect material. “We’re looking to enable journalists to query whether or not there are elements of it that sort of look off,” she explains. These tools help journalists make informed judgments about visual authenticity in real time. Kerle is particularly focused on shifting the conversation from reactive to proactive. “Historically, the conversation around a technical response to disinformation has been in the hands of the third-party platforms. For example, people have asked ‘What’s Facebook going to do about it?’ and ‘What’s Instagram going to do about it?’” she says. “But might there be more we, as a publisher ourselves, could do?”
Antonia Kerle, BBC R&D
Yet she is clear-eyed about the limitations of technology. “It’s more, in my mind, about this slow erosion of trust in the information ecosystem,” she says. “Technology alone can’t fix that.” To that end, her team is also investing in media and AI literacy. “We must work with communities, not just to say, ‘This is true, this is false,’ but to give them the tools to understand the online environment more broadly.” A recent BBC R&D project focused on teaching AI literacy to young people. “It’s not about teaching them to code. It’s about understanding how algorithms influence what you see, and how your data drives what you’re shown,” she explains.
That same ethos applies internally. “One of the things that I’m very proud of is that, early on, our team and our AI research were involved in shaping the BBC’s thinking around generative AI.” Kerle’s team plays a central role in shaping how the organisation understands and adopts emerging tools. “Being able to bring that technical understanding of what is feasible versus what isn’t – that was really critical.” As the BBC considers how to integrate AI into production, editorial decision-making and operations, Kerle emphasises the importance of measured, values-driven implementation. “That is a much longer game,” she says. “If any organisation is thinking about how to use AI, it’s not enough just to have a chatbot that you use occasionally. It’s going to require a fundamental rethinking of how everything is done. And that is going to take time.”