
is exactly that: advice,” he says. “But since clients now have access to the same information, advisers who merely buy and sell stocks will become few and far between.”


The Big Brother of our time? Algorithms came out of the shadows thanks to Google and other internet giants such as Facebook, Amazon and Netflix, which founded their business models on these calculation formulas and built massive databases along the way. “Product discoverability has been a game changer. Before, consumers had to find the cultural products they wanted to read, watch or listen to on their own,” says Jonathan Roberge, Canada research chair in new digital environments and cultural intermediation at the National Institute of Scientific Research in Quebec City. “Today, it’s the technological platforms that find consumers.”

According to Roberge, we are clearly living in an algorithmic culture. Netflix, with its 109 million users in more than 190 countries and with more than 140 million hours of movies, TV shows and documentaries watched daily, is the world’s leader in online entertainment. It’s also the ultimate algorithmic machine, says Pierre Bélanger, a professor at the University of Ottawa’s department of communication. Bélanger is especially interested in how technology is being integrated into the media landscape. “Netflix is powered entirely by algorithms. It’s fuelled by the billions of footprints we leave all over the web,” he says. “Once it gets to know us better, it recommends content that reflects our preferences.” Bélanger uses “netamorphosis” to refer to these powerful algorithms that influence our habits. “I worry that machines have begun to think for me,” he adds.

The issue is all too real: should we be concerned about the rapid proliferation of algorithms or their growing impact on our daily lives? “Through information categorizing, personalized advertising, product recommendations, behavioural targeting and location tracking, supercomputers are becoming more involved in our lives,” says Dominique Cardon, director of the MédiaLab research centre at the Institut d’études politiques de Paris. In his 2015 book, À quoi rêvent les algorithmes (What Do Algorithms Dream About?), he contends that very few everyday habits, purchases, trips, or personal or professional decisions are not guided by a computing infrastructure. “Today, it’s so much simpler and easier to consume cultural products, people don’t even realize that algorithms are influencing them anymore,” says Roberge. “But this is no reason to become paranoid and see it as a systematic intrusion into



our private lives.” Tapp agrees. “We shouldn’t worry, but we do need to be vigilant,” he says. And with good reason: although algorithms have made it much easier to access information, we should be wary of misuse. This is especially true for internet users for whom social media, especially Facebook and its two billion users, is their main source of information.

Some of the shortcomings of algorithms came to light during the 2016 US election, when a multitude of fake news stories circulated on social media. It is even alleged that Russian-backed agents posted content to help Donald Trump get elected. “The danger of social media only showing us content based on our interests is that it reinforces our opinions rather than exposing us to different points of view,” says Bélanger. “The predictive functions of algorithms are their biggest downside. The more people like something, the more of it they’ll get.”

Another risk: in the US and elsewhere, police departments use algorithms to prevent crime. The best-known crime-prevention software, PredPol, created by a US university-based startup, alerts police patrols to possible break-ins, thefts, assaults and homicides. But while these programs can help thwart terrorist attacks, Roberge thinks they can also lead to arbitrary arrests based on racial profiling. “We should always be mindful of possible abuses,” he says. “We can’t pretend that algorithms are neutral. They’re developed by humans and can therefore be biased.” The researcher worries that the technology will evolve faster than our ability to understand it.

Facial recognition is a clear example of this risk. Used by Apple to unlock the new iPhone X and currently being implemented in Canada’s major airports (including Montreal, Toronto and Vancouver), the technology also raises questions about privacy and the protection of personal information. “If we were able to identify anyone at any time, and provide access to biometric data, then there would be no more anonymity,” Roberge says.

But let’s be clear: we are by no means living in the age of Big Brother. And although algorithmic systems are not malevolent in and of themselves, safeguards need to be in place. “I’m not saying that algorithm giants have evil intentions,” says Roberge. “Still, these platforms contain an enormous amount of personal information that the great dictators of the 20th century would have loved to have had at their disposal.”


PIERRE THÉROUX is a Montreal-based freelance writer.

FEBRUARY/MARCH 2018 | CPA MAGAZINE | 43

