Facebook Admits It Was Used to Incite Violence in Myanmar
Facebook has long promoted itself as a tool for bringing people together to make the world a better place. Now the social media giant has acknowledged that in Myanmar it did the opposite, and human rights groups say it has a lot of work to do to fix that.

Facebook failed to prevent its platform from being used to “foment division and incite offline violence” in the country, one of its executives said in a post on Monday, citing a human rights report commissioned by the company.

“We agree that we can and should do more,” the executive, Alex Warofka, a Facebook product policy manager, wrote. He also said Facebook would invest resources in addressing the abuse of its platform in Myanmar that the report outlines.

The report, by Business for Social Responsibility, or BSR, which is based in San Francisco, paints a picture of a company that was unaware of its own potential for doing harm and did little to figure out the facts on the ground.

The report details how Facebook unwittingly entered a country new to the digital era and still emerging from decades of censorship, all the while plagued by political and social divisions.

But the report fails to look closely at how Facebook employees missed a crescendo of posts and misinformation that helped to fuel modern ethnic cleansing in Myanmar.

The report recommends that Facebook increase enforcement of policies for content posted on its platform; exercise greater transparency with data that shows its progress; and engage with civil society and officials in Myanmar.

Some Facebook detractors criticized the company on Tuesday for releasing the report on the eve of the midterm elections in the United States, when the attention of the news media and many of Facebook’s most vocal critics was elsewhere. Human rights groups said Facebook’s pledge needed to be followed up with more concrete actions.

“There are a lot of people at Facebook who have known for a long time that the company should have done more to prevent the gross misuse of its platform in Myanmar,” said Matthew Smith of Fortify Rights, a nonprofit human rights organization that focuses on Southeast Asia.

“This assessment is encouraging and overdue, but the key to any assessment is implementation,” Mr. Smith added.

Phil Robertson, deputy Asia director for Human Rights Watch, said Facebook’s actions in Myanmar would be “the acid test” to see if it became “a responsible platform manager with its own enforceable code of conduct.”

In response to a question about the timing of the release of the report, Facebook said it had previously committed to publishing the report at this time. It also said the report was in line with the company’s commitment to respond to growing concerns in Myanmar.

Two years after the 2016 election in the United States put the company under heightened scrutiny, Facebook has introduced a series of experiments meant to address the problem of misinformation on its platform. It has also set up fact-checking groups and altered its advertising practices to bar those who seek to spread false news.

In Myanmar, Facebook is essentially the internet — and, by extension, the only source of information — for some 20 million people, according to BSR’s estimates. Mobile phones sold there often come with Facebook preinstalled.

As Facebook’s presence in Myanmar grew in recent years, the company did not address what the BSR report calls a “crisis of digital literacy” in a country that was just emerging from a military dictatorship and where the internet was still new.

Many citizens in Myanmar, the report says, still do not know the basics of the internet — from using a browser to setting up an email account — and are not equipped to distinguish real information from rumor. The report warns that this could continue to be a problem for Facebook, especially during the country’s general elections in 2020.

New problems could also arise related to WhatsApp, the messaging app owned by Facebook that is becoming popular in Myanmar.

WhatsApp has begun to play a leading role in elections, particularly in developing countries where it is being used by political parties, religious activists and others to spread information. In India’s recent elections, some WhatsApp messages were used to incite tensions while others were found to be false.

In Myanmar, the prevalence of hate speech, disinformation and bad actors on Facebook “has had a negative impact on freedom of expression, assembly and association for Myanmar’s most vulnerable users,” the report says. This has led to the suppression of free speech; violence and hate campaigns; and self-censorship by women, minorities and other vulnerable members of society.

Myanmar military officials were behind a systematic campaign on Facebook to target the mostly Muslim Rohingya minority, an investigation by The New York Times found. Human rights groups say this campaign has led to murder, rape and forced migration.

Facebook took down the official accounts of military leaders in August. But some activists said the company still had not done enough.

“I don’t think there will be a significant change,” said Ye Wai Phyo Aung, the founder of Athan, a free-speech organization. Facebook should do more to prevent fake accounts from being created, he said.

“It’s still easy to sign up for a Facebook account with fake names,” he said. “The Facebook team should have worked on that.”

Not everyone in Myanmar agreed with the blame placed on Facebook. Ye Myat Thu, the managing director of Alpha Computer Mandalay, an information technology firm, said the government needed to take action to enforce better behavior online and prevent violence.

“Facebook is just the platform,” he said. “If there is no Facebook, people will use another platform. I see Facebook just as a product.”

Source: The New York Times