One of the problems with artificial intelligence is that it is not aimed at the worthiest of the world’s problems.


knowledge being power. So, if you have that sort of data, it’s critical to create ethical frameworks around it. And I don’t want to look back in three years and think that the best technology of our time was used to make people click more ads.”


Old problems


“A big part of the problem is that the people who create this technology lack diversity. That manifests itself in the technology and also the data.”


One manifestation is how male and female personas are given to particular AI assistants.


“A lot of these assistants are given female personalities if they’re doing things like ordering the shopping or playing your favourite music – Siri or Alexa – rather than Ross the robot lawyer, where a male persona is making important decisions.”

It’s a pervasive problem where children shout demands at female bots and listen to male ones. And then the problem goes deeper.

“Let’s say, if you were building something like Siri and you trained it on Wikipedia to build its knowledge. If you do that, it’s going to learn that only 17 per cent of notable people’s biographies are of women.


“The point is that if the data is skewed, the machine does not have a better way of knowing it. Another example is the MIT study of facial recognition systems, where there is a 0.8 per cent error rate for light-skinned men and a 34.7 per cent error rate for darker-skinned women. They failed to recognise Michelle Obama and Oprah Winfrey. It’s a big issue because this technology is being used in policing and criminal justice. So there are a lot of biases because AI is trained on a certain kind of data set.”

March 2019


New solutions


Misleading datasets and poor representation can produce misguided artificial intelligence. It means the positive potential of AI is offset, even negated, by its ability to amplify old prejudices. AI For Good, an organisation Kriti founded, runs projects where machines are created to provide friendlier, less judgemental sources of information than their human counterparts.

Kriti Sharma

“In one of my projects in India we are working on sexual and reproductive health information for young people. We’re using AI to provide access to the right information to the right people at the right time. Young people historically have struggled to get access to vital information about it. It’s very awkward and sometimes socially unacceptable to access this information, but algorithms, designed with the right controls and the right experts in the room, with a safety-first approach, can bridge that gap between the young and information.

“Another very interesting example is a bot we launched in November, called rAInbow, to help victims of domestic violence in South Africa or those at risk of it. Historically victims would have to call a helpline and talk to a human.

“There were major issues about things like stigma and judgement and victim blaming and helplines not being open all the time – the fact that victims or survivors wanted to take action at their own pace in their own time. So we built this non-judgemental machine that was there to give them access to the information and designed to be empathetic. It doesn’t have empathy, but it is designed to be empathetic, and the result is that we had over 150,000 conversations in 90 days from launch and it’s really working as a system which the people suffering actually want rather than something that is given


INFORMATION PROFESSIONAL 21





