IBC2021 ACCELERATOR: AI BIAS DETECTION


As AI’s role in the media space moves beyond its early focus on automation, the rapidly evolving technology is spawning a plethora of new use cases. One of the most intriguing areas of current research is the subject of one of the IBC2021 Accelerators, in which a powerful range of broadcast and news organisation Champions have been investigating its potential for detecting bias in news reporting.


The AI Bias Detection Accelerator builds on the AI-enabled Content Moderation Accelerator project of 2020. Led by Al Jazeera, this year the team has been examining how AI can be used to detect, measure and flag bias in the representation and portrayal of diverse genders, cultures and ethnicities to ensure fairness and transparency in news reporting.


The Accelerator has looked at how the technology can be used to preserve and protect the fundamental notions of neutrality and balance, which are key to the reputation – and, in some cases, perhaps even survival in choppy political climates – of public broadcasters and news organisations around the world.


A TASK THAT SCALES

“As the world’s biggest and oldest news organisation, getting it first and getting it right – speed and accuracy – are two of the pillars on which this temple is built,” says Sandy Macintyre, vice president of news at The Associated Press. “But the third is being fair, balanced and impartial, and therefore avoiding both intended and unintended bias and being extremely careful in our tone.” Uniquely, AP has been working collaboratively with Reuters, as well as with other world-leading news organisations and broadcasters, in the project.

“There are all kinds of biases; coverage bias, selection bias, gatekeeping bias and, obviously for a POC [proof of concept], there’s just far too much to do in a meaningful way, so we’ve decided to zone in on tonality or tone,” explains Dr Niamh McCole, broadcast compliance specialist at RTÉ. “The starting point is the recognition that the language of news broadcasting is a powerful way of conveying very subtle meaning and is a significant means to persuade, to endorse, to contradict, or to cast out.”

While acknowledging that language choices are reinforced by visual elements, whether that be human expression and gesture or choices made in an edit suite, the Accelerator has concentrated on analysing text. This is still a fearsomely complex task. Yves Bergquist is director of the Data and Analytics Project at the University of Southern California’s Entertainment Technology Center and is heading up the programming of the AI. “The words we use are very indicative of our ideology and our opinions about the events that we’re describing,” he says. “Whether we say the word ‘regime’, for example, or ‘government’, those are two different words with two different connotations.”
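Bergquist’s ‘regime’ versus ‘government’ example can be made tangible with off-the-shelf word embeddings. The sketch below is not part of the Accelerator’s toolchain; it simply loads publicly available GloVe vectors via the gensim library and compares each word against a pair of arbitrarily chosen probe words to show that the two terms carry measurably different associations.

```python
# Illustrative only: probing the connotation gap between 'regime' and
# 'government' with pretrained GloVe vectors. The probe words are arbitrary
# choices for this sketch, not anything the Accelerator prescribes.
import gensim.downloader

# Downloads ~130MB of pretrained vectors on first use
vectors = gensim.downloader.load("glove-wiki-gigaword-100")

for word in ("regime", "government"):
    for probe in ("oppressive", "legitimate"):
        # Cosine similarity between the word's vector and the probe's vector
        print(f"{word} ~ {probe}: {vectors.similarity(word, probe):.3f}")
```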


AIs, of course, do not fundamentally understand the nuance between the two words, so they have to be trained. Two different methods are being used in the Accelerator. The first is supervised learning, where the application is trained on massive amounts of data that has been hand-labelled by humans. It also uses sentiment analysis to detect emotional tonalities across a wide number of fields. Is the person being aggressive or happy in what they say? Is the person in a position of power or not? (Those who are tend to use the pronoun ‘we’; people who are in positions where they feel disempowered tend to use ‘I’, and so on.)
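As a rough illustration of that supervised route, the sketch below pairs an off-the-shelf sentiment model from the Hugging Face transformers library with a crude ‘we’/‘I’ pronoun count standing in for the power cue described above; the model choice and the heuristic are assumptions for this sketch, not the Accelerator’s actual system.

```python
# A minimal sketch of the supervised route: an off-the-shelf sentiment model
# plus a crude pronoun-based power signal. Both are illustrative assumptions.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default English model

def tonality_report(sentence: str) -> dict:
    scores = sentiment(sentence)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    words = sentence.lower().split()
    we, i = words.count("we"), words.count("i")
    return {
        "sentence": sentence,
        "sentiment": scores["label"],
        "confidence": round(scores["score"], 3),
        # Rough proxy for the power cue: 'we'-speakers vs 'I'-speakers
        "power_cue": "we-speaker" if we > i else "i-speaker" if i > we else "neutral",
    }

print(tonality_report("We will restore order to the country."))
print(tonality_report("I am afraid of what happens next."))
```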


The second technique is unsupervised machine learning. This is basically clustering. Text is input with no annotations, but the application will recognise clusters of words per topic and per news organisation. So it can say that News Organisation A is using ‘regime’ to describe the Afghan government more than News Organisation B, which is using the word ‘government’. In practice, both methods are being used for the Accelerator. “I think if the field of AI has learned anything over the past 10 or 15 years, it is that what we call ensemble models tend to work a lot better,” says Bergquist. “Using a combination of hybrid approaches and algorithms to solve a problem tends to outperform simply using one model. And that’s basically what we’re trying to do.”
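A toy version of that unsupervised route might look like the following, using scikit-learn to cluster unlabelled story text and then tallying each outlet’s ‘regime’ versus ‘government’ choice. The three one-line “stories”, the outlet names and the cluster count are invented for illustration; real runs would work over full transcripts.

```python
# A toy version of the unsupervised route: TF-IDF features over unlabelled
# story text, clustered with k-means, plus a per-outlet count of the
# 'regime' vs 'government' word choice. All inputs here are invented.
from collections import Counter
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

stories = {
    "Org A": "The regime collapsed as fighters entered Kabul",
    "Org B": "The government fell after provincial capitals were lost",
    "Org C": "Officials of the former regime fled the capital",
}

# Cluster the stories with no annotations at all
texts = list(stories.values())
X = TfidfVectorizer(stop_words="english").fit_transform(texts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for (outlet, text), label in zip(stories.items(), labels):
    counts = Counter(text.lower().split())
    print(outlet, "cluster", label,
          "| regime:", counts["regime"], "government:", counts["government"])
```

An ensemble in Bergquist’s sense could then combine signals like these, for example by feeding the supervised tonality score and an outlet’s distance from the corpus-wide usage profile into a simple meta-model, rather than trusting either method alone.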


EXAMINING THE FALL OF KABUL


The POC is based on using AI to examine the coverage of a single event by multiple news organisations, with the fall of Kabul from 15 August onwards chosen as one example. “We have been looking at the way in which news packages dealt with that event and its aftermath in terms of the corpus of words used, the quotes that broadcasters chose to use, and the language that’s included in the selection of the editing of the interviews,” says McCole.

There are two things worth pointing out here. One is that it is vital that such a tool, and definitely a productised version in the future, is open and transparent. Macintyre says that what goes into the box, the data and the algorithms that power it, need to be open, and we need to be honest about what it can’t do. That way, when news organisations are accused of bias, or if they want to check their own output against reference markers, the whole process takes place in the open and can be examined for any faults or discrepancies.

The second is that this is a tricky subject with many sensitivities. “The truth is disruptive,” says Bergquist. “To confirm your own cultural biases is uncomfortable, and kudos to the organisations in this Accelerator for putting themselves in such an uncomfortable place, because they risk being confronted with their own limitations and biases. I think it’s really great to have big news organisations jumping into this deep end of the pool.”

The team caution that what will be seen at the culmination of the 2021 phase of the project, and in its outputs at year end, will not be a finished product in any way, but it will provide important insights into what the technology can do and point towards some key possibilities to come.


AI Bias Detection Champions: Al Jazeera, AP, BBC, Reuters, RTÉ, ETC (University of Southern California), MultiChoice

For more information on the IBC Accelerator Media Innovation Programme, supported by Nvidia, and to watch the presentations on IBC Digital, visit digital-ibc.expoplatform.com/page/accelerator-media-innovation-programme

