The Washington Post | Democracy Dies in Darkness

Opinion | Facebook must do more to prevent 21st-century genocide

October 28, 2018 at 7:07 p.m. EDT
The Facebook app icon on a smartphone. (Patrick Sison/AP)

IN MYANMAR, Facebook is more than a website. For many residents, it is the entire Internet. So when the nation’s military used the site as a conduit for a campaign against Muslims, there were no guardrails to stop the hatred from spreading — except the company itself. The grim reality on the ground in Myanmar, from which more than 700,000 members of the Rohingya minority have now fled, shows Facebook failed.

The horrifying tale of how a social media site turned into a tool for ethnic cleansing was reported by the New York Times last week. In a coordinated campaign, members of the military ran fake pages devoted to celebrities, from war veterans to beauty queens, to amplify tensions between Myanmar’s Buddhists and the Rohingya. The military, the postings suggested, was the only bulwark against terrorist attacks by murderous Islamists.

There are 18 million Internet users in Myanmar. The pages that Facebook removed after the Times tipped it off had 1.35 million followers. Pages the platform removed in August had 12 million followers.

Facebook did too little to confront this hatred for too long. The campaign began half a decade ago, but until recently, Facebook responded to reports of hate speech only as they arrived individually. In 2013, just three of Facebook’s content reviewers spoke the Myanmar language. Now, 60 do, and Facebook plans to have more than 100 by the end of the year. Facebook has also formed a team dedicated to Myanmar, commissioned a human rights impact assessment and pledged to improve artificial intelligence to identify rule violations as well as streamline the reporting process.

These are positive steps, but Facebook must do more to prevent the next 21st-century genocide. Myanmar is far from the only country where Facebook has the footprint of a colossus and where cultural tensions are primed for exploitation. Those who wish to mobilize the citizenry against a particular group no longer need to print a paper, or transmit over the radio, or even set up a website. The website is there already, and anyone can be a broadcaster.

To answer this reality, Facebook must proactively monitor its impact in the countries where it is most powerful. Built-out teams in sensitive areas, with members who speak the local language, should search for patterns that researchers warn precede genocide. Engineers should also partner more closely with civil society to find and remove technical flaws in reporting systems and elsewhere. And Facebook needs to focus on bad actors as much as bad content. The company has already updated its policies against misinformation that could contribute to credible violence; it should also adjust its rules against “dangerous individuals” to include officials and others promoting ethnic cleansing and root out those accounts without prompting.

Facebook has always wanted to “connect the world,” and through free-data deals with phone operators in developing countries, it has ensured that much of that connection happens on its platform. If Facebook wants these citizens to invest in its services, it needs to invest in their safety, too.
