Julia Carrie Wong, writing for The Guardian:
The social network is now officially available in 111 languages. The rules that govern what users can or cannot post on the site – on crucial issues ranging from hate speech and incitement to violence, to health misinformation and self-harm – had only been translated into 41 languages as of April 2019, according to an investigation by Reuters. The company’s 15,000 content moderators were fluent in only about 50 languages, and the company’s much-vaunted automated tools that are supposed to detect dangerous content worked in only about 30.
Facebook is choosing growth over safety.
Nothing says “we care about moderation” like expanding into a region and translating your site, but not bothering to translate your moderation policies or to have moderators on hand who can read the languages being used on your network.
Two quick reminders, drawn from Wong’s article:
First, Facebook has been implicated in ethnic cleansing in Myanmar and in anti-Muslim riots in Sri Lanka. In both cases, language barriers were a problem.
And second, Facebook has seen $69 billion in net profit since 2010.
Facebook has expanded into areas – some of which are volatile – that it doesn’t understand without providing the tools needed to moderate whatever’s posted.
This isn’t just an oversight. It’s a choice. It’s a denial of responsibility. And it’s a failure to protect people.