Court: Facebook must reveal role in Burma genocide


A US federal judge ordered Facebook on Sept. 22 to produce documents relating to its involvement in violence against the Rohingya people in Burma. The Gambia brought a claim against Facebook, Inc before the International Court of Justice alleging that the social media platform played a key role in the genocide of the Rohingya, a Muslim ethnic minority. The Gambia then filed suit against Facebook in the District of Columbia, under 28 USC § 1782, seeking certain documentation related to the World Court case. Facebook admitted that it failed to respond in a timely manner to concerns about its role in the Rohingya genocide. The Gambia's case contended that it was only in 2018, six years into the genocide, that Facebook began deleting accounts and content used by Burmese government officials to inflame attacks on the Rohingya.

In court, the Gambia sought access to the deleted Facebook content. It alleged that as the genocide was occurring, Facebook continued to expose millions of people in Burma, many with limited digital-media literacy, to disinformation and rumors about the Rohingya.

Facebook countered that the Gambia's request for deleted documents violated the right to privacy under the Stored Communications Act. Judge Zia Faruqui rejected this argument, stating: "Facebook taking up the mantle of privacy rights is rich with irony." Hence, the court granted the Gambia access to Facebook's "de-platformed content and related internal investigation documents," which could prove "significant to The Gambia's ability to prove genocidal intent."

In April, the group Muslim Advocates also sued Facebook, alleging that the company violated the DC Consumer Protection Procedures Act by allowing fraudulent and negligent misrepresentation of Muslims on its platform. That same month, US senators introduced the Rohingya Genocide Determination Act, requiring the State Department to investigate whether the Burmese military’s “clearance operations” against the Rohingya constituted genocide.

From Jurist, Sept. 23. Used with permission.

See our last reports on the Rohingya, and Facebook’s enabling of genocide and connivance with oppressive regimes.

Photo: UNHCR

  1. Facebook accused of fanning hate

    A Facebook whistleblower has accused the social network of "literally fanning ethnic violence" in places like Burma and Ethiopia. Testifying to US senators last week, Frances Haugen, a former manager, said Facebook's algorithm optimizes for "high-engagement" content—posts that provoke the most extreme reactions, designed to keep users on the platform. She alleged the company systematically puts profit before the public good. Although Facebook has policies in place to prevent hate speech, an investigation in Burma by the rights group Global Witness found that the algorithm incited violence during the military's 2017 purge of the Rohingya minority. "What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it," Haugen told lawmakers. Her testimony follows her leak to the Wall Street Journal of scores of internal documents demonstrating how the company ignored its own research warning about the impact of its audience engagement policy on democracy, human rights, and public health. (TNH)