Facebook: We should do more to prevent violence in Myanmar
A report commissioned by Facebook comes after a UN team found that the social network played a "determining role" in the crisis.
Steven Musil, Night Editor / News
Facebook hasn't always done enough to prevent its platform from spreading hate speech that's fueled deadly violence in Myanmar, according to an independent report commissioned by the company.
The report, conducted by the nonprofit Business for Social Responsibility, also offered recommendations for improving human rights in the country, including stricter enforcement of content policies and regular publication of data related to human rights violations.
"The report concludes that, prior to this year, we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence," Alex Warofka, Facebook product policy manager, wrote in a blog post Monday. "We agree that we can and should do more."
The findings come amid accusations of widespread genocide committed by the military in Myanmar. In March, UN human rights experts investigating violence in the country concluded that Facebook played a "determining role" in the crisis, in which hundreds of thousands of Rohingya Muslims have fled the country.
BSR recommended that Facebook improve enforcement of its community standards, which describe what is and isn't allowed on the social network. Facebook said a key step toward that goal is the team it has nearly finished assembling, one that combines an understanding of local Myanmar issues with policy and operations expertise.
Facebook said it's using the social-listening tool CrowdTangle to analyze potentially harmful content and understand how it spreads in Myanmar. The company is also using artificial intelligence to identify and prevent the spread of posts that contain graphic violence or dehumanizing comments.
The report also suggested preserving and sharing data that can be used to evaluate human rights violations, particularly data specific to the situation in Myanmar, so the international community can better assess the company's enforcement efforts.
"We are committed to working with and providing information to the relevant authorities as they investigate international human rights violations in Myanmar, and we are preserving data for this purpose," Warofka wrote, noting it took this approach with content and accounts associated with the Myanmar military it removed in August and October.
Another recommendation is that Facebook establish a policy defining its approach to content moderation with respect to human rights, a suggestion Warofka said Facebook is "looking into."
The UN's top human rights officials recommended in August that Myanmar military leaders be prosecuted for genocide against Rohingya Muslims. More than 700,000 Rohingya Muslims have fled Myanmar's Rakhine state since rebel attacks sparked a military backlash in August 2017.
UN investigators have reportedly found numerous crimes committed against the minority in Myanmar, including gang rape, enslavement, the torching of villages and the killing of children. Roughly 10,000 people have reportedly been killed in the violence.