27 April 2018

Zuckerberg Was Called Out Over Myanmar Violence. Here’s His Apology.

By KEVIN ROOSE and PAUL MOZUR

Rohingya refugees arriving in Bangladesh last year from Myanmar, where advocacy groups have criticized Facebook’s approach to hate speech. In an email, Mark Zuckerberg, Facebook’s chief executive, told the groups: “I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues.”

Last week, after frustrated activists from Myanmar sent an open letter to Mark Zuckerberg, the chief executive of Facebook, they got something unexpected: a reply. The activists, representing six civil society organizations, harshly criticized Mr. Zuckerberg in the letter, saying he had mischaracterized Facebook’s response to violence-inciting messages in Myanmar and had not devoted sufficient resources to enforcing its hate speech rules in the violence-stricken country. Mr. Zuckerberg wrote back to the group the next day from his personal email address, apologizing for misspeaking and outlining steps that Facebook was taking to increase its moderation efforts.


Mr. Zuckerberg’s email, which was provided to The New York Times by the activist groups, was the chief executive’s first direct communication with the local groups that have criticized Facebook’s role in the country’s growing humanitarian crisis. Facebook has been accused by United Nations investigators and human rights groups of facilitating violence against Rohingya Muslims, a minority ethnic group, by allowing anti-Muslim hate speech and false news to spread on its platform.

Facebook is a dominant source of information in Myanmar, and civil society groups have accused it of being a kind of absentee landlord, with few moderators and systems in place to keep extremists from using Facebook posts to incite violence.

In his email, Mr. Zuckerberg said Facebook had added “dozens” of Burmese language content reviewers to monitor reports of hate speech and had “increased the number of people across the company on Myanmar-related issues,” including a product team working on building tools to try to help stem the violence there.

The disagreement centers on a chain letter that spread on Facebook Messenger in Myanmar in September. The messages warned Buddhist communities of an imminent Muslim attack. Meanwhile, Muslim populations received a separate message cautioning them of violence from militant Buddhist groups.

Civil society groups say the messages paralyzed major cities in Myanmar and raised fears of a violent clash. Such incitement and scaremongering have become far too typical on Facebook, according to the groups, which say Facebook has repeatedly failed to follow through on promises to devote more resources to the issues.

In an interview last week, Mr. Zuckerberg appeared to hold up the September episode as a model of Facebook’s effectiveness, and said the company’s systems had detected the messages and stopped them. In fact, the activists said, they flagged the messages repeatedly to Facebook, barraging its employees with strongly worded appeals until the company finally stepped in to help.


Rohingya refugees reaching Malaysia this month. Civil society groups in Myanmar have said Facebook doesn’t do enough to keep extremists from using posts to incite violence. Credit: Agence France-Presse/Getty Images

Mr. Zuckerberg’s personal email did not ease the activists’ frustration. The groups say the biggest obstacle to their attempts to push back against a torrent of dangerous hate speech is not their lack of resources but Facebook itself. They said Facebook had a history of pledging to do more to help quell ethnic violence in Myanmar but had not fulfilled its promises.

“It’s great that he’s engaging personally with this, but the stuff he’s talking about is really not that much different from what they’ve been saying for the past few years,” said Jes Petersen, the chief executive of Phandeeyar, an innovation lab in Myanmar that has worked with Facebook to produce localized versions of its community standards.

A Facebook spokeswoman, Debbie Frost, confirmed the authenticity of Mr. Zuckerberg’s email, and said Facebook was planning to continue engaging with the activists.


Years after civil society groups first began flagging hate speech in Myanmar, the company still has no permanent office or staff in the country and seems to be struggling to give its platform sufficient oversight. In Germany, where hate speech laws require vigilant attention from content reviewers, Facebook has hired about 1,200 moderators. In order to achieve the same ratio of users to moderators in Myanmar, Facebook would need to have around 800 reviewers in the country, Mr. Petersen calculated.

“Dozens of content reviewers is not going to cut it,” he said.
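For readers who want to trace that back-of-the-envelope calculation, here is a minimal sketch in Python. The Facebook user counts below are illustrative assumptions, not figures given in the article; only the roughly 1,200 German moderators and the approximately 800-reviewer conclusion come from the reporting.

    # Rough sketch of the users-per-moderator comparison Mr. Petersen described.
    # ASSUMPTION: the user counts for Germany and Myanmar are illustrative
    # placeholders, not numbers from the article.

    germany_users = 30_000_000      # assumed Facebook users in Germany (illustrative)
    germany_moderators = 1_200      # moderator count cited in the article
    myanmar_users = 20_000_000      # assumed Facebook users in Myanmar (illustrative)

    users_per_moderator = germany_users / germany_moderators   # about 25,000
    reviewers_needed = myanmar_users / users_per_moderator     # about 800

    print(f"Germany: one moderator per {users_per_moderator:,.0f} users")
    print(f"Myanmar would need about {reviewers_needed:,.0f} reviewers at that ratio")

Under these assumed user counts, the same ratio of users to moderators that Facebook maintains in Germany would indeed require on the order of 800 reviewers for Myanmar, far more than the “dozens” the company cited.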

The civil society groups have already responded to Mr. Zuckerberg’s reply, asking for hard data about Facebook’s efforts in the region, including how many Burmese-speaking reviewers the company has, how many accounts the company has taken down in Myanmar and how long, on average, it takes for Facebook to respond to reports of hate speech.

“A lot of what they’ve been doing is cosmetic — it’s not the tangible improvement we’re looking for,” said Victoire Rio, a social media analyst in Myanmar who was named in Mr. Zuckerberg’s reply.

Activists in other developing countries have raised similar complaints about Facebook’s behavior. In Indonesia, politicians have called Facebook executives to account for the spread of disinformation. In the Philippines, critics of President Rodrigo Duterte have faced barrages of threatening posts. Last month, the government of Sri Lanka ordered Facebook blocked in an attempt to stem mob violence against Muslim communities.

Last month, Adam Mosseri, the head of Facebook’s News Feed, said in an interview that he and other Facebook executives “lose some sleep” over the possibility that Facebook had led to real-world violence.

Mr. Petersen said he hoped Mr. Zuckerberg’s reply would spur actual change and not just expressions of worry. “I wonder how he spent those sleepless nights — because we didn’t see that much change,” he said.

Here is the full text of Mr. Zuckerberg’s email to the civil society groups, followed by the groups’ response:

I wanted to personally respond to your open letter. Thank you for writing it and I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to.

In making my remarks, my intention was to highlight how we’re building artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community.

These improvements in technology and tools are the kinds of solutions that your organizations have called on us to implement and we are committed to doing even more. For example, we are rolling out improvements to our reporting mechanism in Messenger to make it easier to find and simpler for people to report conversations.

In addition to improving our technology and tools, we have added dozens more Burmese language reviewers to handle reports from users across all our services. We have also increased the number of people across the company on Myanmar-related issues and we now have a special product team working to better understand the specific local challenges and build the right tools to help keep people there safe.

There are several other improvements we have made or are making, and I have directed my teams to ensure we are doing all we can to get your feedback and keep you informed.

We are grateful for your support as we map out our ongoing work in Myanmar, and we are committed to working with you to find more ways to be responsive to these important issues.

Mark

The civil society groups’ response. The half-dozen signatories of the response include Phandeeyar, a leading technology hub in the country; the Myanmar ICT for Development Organization, which monitors online hate speech; and the Center for Social Integrity.

Dear Mark,

Thank you for responding to our letter from your personal email account. It means a lot.

We also appreciate your reiteration of the steps Facebook has taken and intends to take to improve your performance in Myanmar.

This doesn’t change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the U.S. or Europe.

When things go wrong in Myanmar, the consequences can be really serious — potentially disastrous. You have yourself publicly acknowledged the risk of the platform being abused towards real harm.

Like many discussions we have had with your policy team previously, your email focuses on inputs. We care about performance, progress and positive outcomes.

In the spirit of transparency, we would greatly appreciate if you could provide us with the following indicators, starting with the month of March 2018:

■ How many reports of abuse have you received?

■ What % of reported abuses did your team ultimately remove due to violations of the community standards?

■ How many accounts were behind flagging the reports received?

■ What was the average time it took for your review team to provide a final response to users of the reports they have raised? What % of the reports received took more than 48 hours to receive a review?

■ Do you have a target for review times? Data from our own monitoring suggests that you might have an internal standard for review — with most reported posts being reviewed shortly after the 48 hrs mark. Is this accurate?

■ How many fake accounts did you identify and remove?

■ How many accounts did you subject to a temporary ban? How many did you ban from the platform?

Improved performance comes with investments and we would also like to ask for more clarifications around these. Most importantly, we would like to know:

■ How many Myanmar speaking reviewers did you have, in total, as of March 2018? How many do you expect to have by the end of the year? We are specifically interested in reviewers working on the Facebook service and looking for full-time equivalents figure.

■ What mechanisms do you have in place for stopping repeat offenders in Myanmar? We know for a fact that fake accounts remain a key issue and that individuals who were found to violate the community standards on a number of occasions continue to have a presence on the platform.

■ What steps have you taken to date to address the duplicate posts issue we raised in the briefing we provided your team in December 2017?

We’re enclosing our December briefing for your reference, as it further elaborates on the challenges we have been trying to work through with Facebook.
