- Mark Zuckerberg and Facebook have been criticised by Myanmar organisations amid a suspected genocide in the country.
- The Facebook CEO has been accused of not doing enough to crack down on hate speech and direct incitements to violence in Myanmar.
- Hundreds of thousands of Rohingya Muslims have had to flee their homes.
- The UN has previously said Facebook has “substantively contributed to the level of acrimony and dissension and conflict.”
Civil society groups in Myanmar have written an open letter to Mark Zuckerberg criticising Facebook’s response to the spread of hate speech that incites violence amid the suspected genocide of Rohingya Muslims in the country, calling the social network’s response “inadequate.”
Myanmar-based innovation lab Phandeeyar, the Center for Social Integrity, the Myanmar Human Rights Education Network and others wrote that Facebook’s response has “an over-reliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions and a lack of transparency.”
They added: “The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar.”
Nearly 700,000 Rohingya Muslims have fled from the country over the last year in the face of the killings of thousands, including children. The UN’s human rights chief has said he strongly suspects “acts of genocide” have occurred, with reports indicating a “deliberate attempt by the authorities to destroy evidence of potential international crimes, including possible crimes against humanity.”
Analysts and civil society organisations on the ground have said that Facebook is being used to spread anti-Rohingya sentiment, with one analysis showing a surge in hate speech being spread on the platform at the start of the crisis.
“Facebook definitely helped certain elements of society to determine the narrative of the conflict in Myanmar,” analyst Raymond Serrato previously told the Guardian. “Although Facebook had been used in the past to spread hate speech and misinformation, it took on greater potency after the attacks.”
Facebook says the problem is getting “a lot of focus” — but Myanmar groups say it’s not enough
In an interview with Vox at the start of April, Zuckerberg was asked about Facebook’s role in Myanmar and whether it was helping to spread propaganda that contributes to ethnic cleansing.
The CEO responded that the “Myanmar issues” are getting “a lot of focus inside the company,” and cited an example of how Facebook had detected “sensational messages” being spread through Facebook Messenger, which it then prevented from spreading.
Here’s the key passage — emphasis added:
“The Myanmar issues have, I think, gotten a lot of focus inside the company. I remember, one Saturday morning, I got a phone call and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, “Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.” And then the same thing on the other side.
“So that’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that’s going on. We stop those messages from going through. But this is certainly something that we’re paying a lot of attention to.
In Thursday’s open letter, however, the Myanmar organisations said that this example actually highlighted the flaws in Facebook’s approach.
“Far from being stopped, [the messages] spread in an unprecedented way, reaching country-wide and causing widespread fear and at least three violent incidents in the process,” they wrote.
They believe their organisations were the unspecified “systems” that detected the messages, and wrote they were only able to reach Facebook four days after the messages started spreading, “with thousands, if not hundreds of thousands [of message recipients] being reached in the meantime.”
Facebook has since stated that it uses technology to automatically scan Messenger conversations among users to detect problematic content, though it’s not clear why the messages the groups refer to were not picked up sooner by this system.
‘Facebook has rapidly come to play a dominant role in how information is accessed and communicated’
The letter also critiques the fact that there are, to the groups’ knowledge, no Burmese speakers working at Facebook, and says there are no Facebook employees working in the country.
Facebook has not responded to the organisations about “many of the issues” and suggestions they raised in December 2017, they said, and they called on Facebook to invest more in moderation.
“The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar. We appreciate that progress is an iterative process and that it will require more than this letter for Facebook to fix these issues,” the letter says.
“If you are serious about making Facebook better, however, we urge you to invest more into moderation – particularly in countries, such as Myanmar, where Facebook has rapidly come to play a dominant role in how information is accessed and communicated; We urge you to be more intent and proactive in engaging local groups, such as ours, who are invested in finding solutions, and – perhaps most importantly – we urge you to be more transparent about your processes, progress and the performance of your interventions, so as to enable us to work more effectively together.”
The UN has also criticised Facebook’s role in the spread of hate speech in the country, with one official saying in March: “It has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”
Facebook did not respond to Business Insider’s request for comment.
The full letter from the Myanmar organisations is below: