Rohingya refugees sued Facebook on December 6 for $150 billion over claims that the social network is failing to stem hate speech on its platform, exacerbating violence against the vulnerable minority.
The complaint, lodged in a California court, says the algorithms that power the US-based company's platform promote disinformation and extremist content that translate into real-world violence.
“Facebook is like a robot programmed with a singular mission: to grow,” the court document states.
“The undeniable reality is that Facebook’s growth, fuelled by hate, division, and misinformation, has left hundreds of thousands of devastated Rohingya lives in its wake.”
The mainly Muslim group faces widespread discrimination in Myanmar, where they are despised as interlopers despite having lived in the country for generations.
A military-backed campaign that the UN said amounted to “genocide” saw hundreds of thousands of Rohingya driven across the border into Bangladesh in 2017, where they have since lived in sprawling refugee camps.
Many others remain in Myanmar, where they are denied citizenship and are subject to communal violence, as well as official discrimination by the military-led State Administration Council.
The legal complaint argues that Facebook’s algorithms drive susceptible users to join ever-more extreme groups, a situation that is “open to exploitation by autocratic politicians and regimes”.
Rights groups have long charged that Facebook does not do enough to prevent the spread of disinformation and misinformation online.
Critics say even when alerted to hate speech on its platform, the company fails to act.
They charge that the social media giant allows falsehoods to proliferate, affecting the lives of minorities and skewing elections in democracies such as the US, where unfounded charges of fraud circulate and intensify among like-minded friends.
This year, a huge leak by a company insider sparked articles arguing that Facebook, whose parent company is now called Meta, knew its sites could harm some of its billions of users, but that executives chose growth over safety.
Whistleblower Frances Haugen told the US Congress in October that Facebook is “fanning ethnic violence” in some countries.
Under US law, Facebook is largely protected from liability over content posted by its users.
The Rohingya lawsuit, anticipating this defence, argues that where applicable, the law of Myanmar – which has no such protections – should prevail in the case.
Facebook, which did not immediately respond to questions about the lawsuit, has been under pressure in the US and Europe to clamp down on false information, particularly over elections and the coronavirus.
The company has forged partnerships with several media companies aimed at verifying online posts and removing those that are untrue.