Winning the Toxicity Arms Race

Mike Pappas, CEO of Modulate, explains how automated voice moderation has a natural advantage over methods of combatting abuse through text


Automated content moderation has, ever since its earliest incarnations, been understood to be in an arms race of sorts. A well-intentioned developer might build a tool to recognise certain hateful behaviours; mere moments later, trolls and other bad actors will conceive of a way to work around it. Trying to look for certain keywords? No problem, we’ll just replace the letter e with 3, o with 0, etc. Oh, you figured out the common replacements? We’ll get fancier, then, and use more obscure replacement characters, like the Greek letter ο (that’s an omicron, not the letter ‘o’: completely different to a computer program, but still completely readable to a person). Text filters figure that one out, too? Well, we can get craftier still by strategically misspelling words, adding spaces or punctuation marks between the letters, or turning to other tricks of the same kind.

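To make concrete just how much work that cat-and-mouse game creates, here is a minimal, hypothetical sketch of the kind of normalisation a keyword filter has to perform before it can even start matching. The blocked-word list, substitution tables and function names below are illustrative assumptions, not any real moderation product’s code.

    # A deliberately simple keyword filter with the normalisation steps the
    # article describes: digit substitutions, homoglyphs, and inserted spaces
    # or punctuation. Word list and tables are illustrative only.
    import re
    import unicodedata

    BLOCKED_WORDS = {"badword"}                    # hypothetical example entry
    LEET_MAP = str.maketrans("013457", "oleast")   # 0->o, 1->l, 3->e, 4->a, 5->s, 7->t
    HOMOGLYPHS = {"ο": "o", "а": "a", "е": "e"}    # Greek omicron, Cyrillic a and e

    def normalise(text: str) -> str:
        text = text.lower()
        # Fold known look-alike characters back to plain Latin letters.
        text = "".join(HOMOGLYPHS.get(ch, ch) for ch in text)
        text = unicodedata.normalize("NFKD", text)
        # Undo common digit-for-letter swaps.
        text = text.translate(LEET_MAP)
        # Strip spaces, punctuation and anything else wedged between letters.
        return re.sub(r"[^a-z]", "", text)

    def contains_blocked_word(message: str) -> bool:
        cleaned = normalise(message)
        return any(word in cleaned for word in BLOCKED_WORDS)

    print(contains_blocked_word("b4dw0rd"))        # True
    print(contains_blocked_word("b a d w ο r d"))  # True: omicron folded to 'o'

Every new evasion trick demands another normalisation step along these lines, which is exactly the treadmill described above.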

The best text moderation tools today handle all of these special cases and more… yet there are few forces more powerful than the ingenuity of people, even when the people in question are just really determined to type a racial slur. As such, any platform implementing text moderation needs to be constantly on watch for the next shot fired in this eternal war, and must invest heavily in evolving its tools over time as new evasion techniques are developed.

Given this situation, it’s understandable that most people assume all content moderation efforts will be caught in this sort of perpetual cycle. Indeed, image and video moderation tends to suffer from the same challenges, as computers interpret pixels quite differently from the human eye, leaving plenty of room for the same sorts

