Industry Opinion
How legislation is raising the stakes for online platforms
Chris Downie, CEO at Pasabi, discusses how forthcoming legislation will raise the stakes for online platforms.
Winston Churchill famously said, "The price of greatness is responsibility". In the digital world, where more than half the globe's population now uses a variety of online platforms, these sites have been enjoying great success. Thirteen new users globally start using social media for the first time every second; that's a lot of responsibility resting on platforms' shoulders.

Historically, however, platforms haven't felt this weight of responsibility. They haven't been liable for illegal user-generated content they host that they don't know about. They only became liable if their technology identified it, or they were made aware of it by users, and then failed to remove it. Unsurprisingly, with no incentive for these sites to proactively police their user-generated content, they've gambled users' trust and safety for growth. The stakes are about to change as platforms will be made more accountable for their content under the forthcoming UK Online Safety Bill and EU Digital Services Act (DSA).

Increased Internet usage means increased risk

As consumers' usage of online marketplaces and other websites increases, these platforms should be aware of the harms to which they can potentially expose their users: hate speech, revenge pornography and terrorist content, among others. In the wrong hands, online platforms can support criminal activity or the abuse of their users. In the UK, two-thirds of adults are concerned about viewing harmful content online and almost half report seeing hateful content over the past year. As a result, the UK government has drafted the Online Safety Bill to create a new regulatory framework for online safety and to define companies' responsibilities in keeping UK users, especially children, safer online. Similar concerns have been addressed in the EU with the creation of the Digital Services Act to protect consumers from illegal products and content online.

The new UK legislation will make companies that offer user-to-user services (e.g. social media sites, discussion forums, messaging services) accountable for protecting children and other vulnerable users. They will have to make it easier for consumers to report harmful content and have it removed. Platforms will need robust moderation techniques to block or remove content that promotes suicide, incites violence or constitutes cyberbullying.

Ofcom appointed as UK regulator

Ofcom, the UK communications regulator, will be empowered to enforce the new regulations. Companies in scope will need to provide annual transparency reports indicating the prevalence of harmful content on their platforms and the measures they're taking to tackle it. These reports will also be published online for users to view. Additionally, Ofcom will be able to request information regarding the use of tools for proactively finding, flagging, blocking and removing harmful or illegal content. Platforms operating in the UK, regardless of physical location, will need to comply or face fines of up to £18 million or 10% of a provider's annual global revenue. Non-compliant providers could also receive court orders to disrupt or prevent access to their services.

In the EU, MEPs have given the go-ahead for member states to negotiate the details of the DSA regarding the moderation of illegal products and content. Consumers have long been demanding a safer online experience, whilst brands have been looking for platforms to take more responsibility for their third-party products and content. MEPs want clear, EU-wide rules for content moderation and are looking to apply a 'notice and action' mechanism which should:
• Be effective - provide channels for users to flag illegal content and for platforms to act on it swiftly
• Not be abused - users should be notified if content is flagged or removed, with the ability to appeal
• Respect users' rights - protect freedom of expression, ensuring only illegal content is removed and in a non-discriminatory manner

Innovation in technology has been happening so quickly that regulators have so far been unable to keep up with it. The advent of the DSA and the Online Safety Bill, however, marks a new era in digital regulation. It will no longer be possible, or acceptable, for platforms to ignore the presence of illegal content and products. Consumers are looking for change, and regulators are finally starting to take proper notice and action.