Prof. Shannon Vallor is set to take up the ethics of data and AI role early next year


Vallor says: “One of the areas that has become really centrally important is understanding automated decision-making, especially when it is driven by the large amounts of data used to train machine learning or AI models. We’ve seen there are serious challenges in ensuring those decisions don’t amplify or perpetuate unfair biases that exist in society, whether those are gender biases, race biases, biases against certain age groups, biases against people who come from a particular region, or biases against people with disabilities.”

But Vallor is keen to add the “qualification” that the unfairness doesn’t come, per se, from an unhealthy attitude among the people training the algorithms; quite often it is simply an unintended consequence of their actions, and can be mitigated by, for example, greater diversity among the technologists and also among the subjects of the data.

She adds: “It’s really important that design teams be inclusive and diverse so that we have all of the perspectives on, for example, who the user of a technology might be, so that we are really able to cover one another’s blind spots, so to speak. We all have blind spots, and all designers and developers of technology are familiar with the world through a certain lens. Sometimes, when you have a team who are only looking through one narrow lens, it’s easier to miss some of the risks that a technology might pose.”


Vallor arrives in Edinburgh at an interesting time for AI. The Scottish Government has recently announced its intention to create a national AI strategy, which will come to fruition in September next year, and roughly half the investment in the City Region Deal is focused on data-driven innovation. These public sector initiatives will be among her “first priorities” on taking up the new post, she says, as she believes government has a big role to play in shaping new AI technologies, as well as in using them to deliver its own services in future. She will also help identify interesting areas for her PhD candidates to focus their research on, and is excited about filling a professional gap, with demand for AI ethicists growing in industry.

She thinks Scotland is in a good position to learn from some of the “mistakes” made by Silicon Valley in its desire, perhaps, to ‘move fast and break things’. A more cautious, considered and inclusive approach to using AI, grounded in consultation and in having a “public conversation” about the risks and benefits, will be key moving forward.

But above all, as we approach the end of our Skype call, she is keen to present a sunnier vision of AI (fittingly, perhaps, for a Californian), one in which the rise of the machines is not inevitable. To the generation that grew up with the Terminator movies, she insists that AI is not about to ‘develop into Skynet’ and that our jobs are not going to be taken over by robots any time soon. As someone who also grew up with Arnie as a sci-fi hero, it is an optimism I’m happy to share. But I’m sure we’ll be back.

