UCA News
Benedict Rogers

Facebook's link to Rohingya genocide

Will parent company, Meta, choose to be an ‘accomplice to genocide’ rather than a ‘force for good’?
Published: October 03, 2022 04:01 AM GMT

Updated: November 02, 2022 11:25 AM GMT

Rohingya refugees listen to a speaker during a "Genocide Remembrance Day" rally at a refugee camp in Ukhia, Bangladesh, on Aug 25, marking the fifth anniversary of their flight from a military offensive in Myanmar. (Photo: AFP)

I use Facebook a lot. I use it as a source of information, and to disseminate information and ideas. I also use it, occasionally, for lighter moments, to share photographs of beautiful scenery, spiritual inspiration and moments with family and friends. Facebook can be a force for good, if we use it well.

But a new, powerful and important report by Amnesty International titled The Social Atrocity: Meta and the right to remedy for the Rohingya reveals the dark side of Facebook.  

The Amnesty report reveals that Facebook’s owner, Meta, knew — or at least should have known — that its algorithmic systems were “supercharging the spread of harmful anti-Rohingya content in Myanmar,” and yet the company failed to act. In other words, Meta’s focus on profit through its use of algorithms “substantially contributed to the atrocities perpetrated by the Myanmar regime against the Rohingya people in 2017.”

Those atrocities have been recognized by the US government and others as genocide and crimes against humanity. The International Court of Justice has ruled that a genocide case concerning the Rohingya falls within its jurisdiction.

Sawyeddollah, a 21-year-old Rohingya refugee, told Amnesty International: “I saw a lot of horrible things on Facebook. And I just thought that the people who posted that were bad… Then I realized that it is not only these people — the posters — but Facebook is also responsible. Facebook is helping them by not taking care of their platform.”

In principle, I am a defender of free speech and an opponent of censorship. I dislike the so-called “cancel culture” or “de-platforming” that has become a phenomenon, particularly in Western democracies, where people are hounded out of the public square, online and offline, because they express views with which some others might disagree. I worry about this growing trend. It is anti-democratic and risks undermining our civil liberties.

"If Amnesty’s evidence is correct, Facebook not only failed to take down posts that spread hatred of the Rohingyas — they actively promoted them"

But that is not what we are talking about here. There is a big difference between, for example, feminist Germaine Greer or novelist JK Rowling being “de-platformed” by over-zealous intolerant proponents of a new gender ideology because these two women have challenged the transgender agenda, and the misuse of social media to stir racial and religious hatred on such a scale that it results in genocide.

And it seems to me that too often Facebook joins the “cancel culture” in the West by taking down posts that might not be “woke” or “politically correct” while being complicit with posts that actively supported, encouraged and even incited rape, torture and murder. Facebook needs, at the very least, to reconsider its priorities.

For if Amnesty’s evidence is correct, Facebook not only failed to take down posts that spread hatred of the Rohingyas — they actively promoted them. Through algorithms, such posts were boosted.

“Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendation and group features, shaping what is seen on the platform. Meta profits when Facebook users stay on the platform as long as possible, by selling more targeted advertising. The display of inflammatory content — including that which advocates hatred, constituting incitement to violence, hostility and discrimination — is an effective way of keeping people on the platform longer. As such, the promotion and amplification of this type of content is key to the surveillance-based business model of Facebook,” according to Amnesty.  

Facebook in Myanmar, in the months and years prior to the 2017 military campaign against the Rohingya, became, in Amnesty’s words, “an echo chamber of anti-Rohingya content.”

Knowing Myanmar well, I am not surprised by this on one level. Facebook is used in Myanmar like a news channel — indeed, for many people, it is their only source of information. With the rapid rise of affordable mobile phone technology for people across Myanmar in the past decade, Facebook opened up a new world of communications and information. While that could have been a force for liberalization and democratization and the spread of good ideas, instead — for Myanmar — it became an agent of genocide.

"Even content from the very top levels of Myanmar’s military inciting violence and discrimination was posted unchallenged"

As Amnesty notes: “Actors linked to the Myanmar military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, posting disinformation claiming there was going to be an impending Muslim takeover, and portraying the Rohingya as ‘invaders’.”

The UN’s Independent International Fact-Finding Mission on Myanmar ultimately concluded that the “role of social media [was] significant” in the atrocities in a country where “Facebook is the internet.”

One post, shared more than 1,000 times, pictured a Muslim human rights defender described as a “national traitor.” The comments left on the post included threatening and racist messages, such as “He is a Muslim. Muslims are dogs and need to be shot” and “Don’t leave him alive. Remove his whole race. Time is ticking.”

Even content from the very top levels of Myanmar’s military inciting violence and discrimination was posted unchallenged. Senior General Min Aung Hlaing, the leader of Myanmar’s military, posted on his Facebook page in 2017: “We openly declare that absolutely, our country has no Rohingya race.” He went on to seize power in a coup in February 2021.

Meta repeatedly failed to conduct appropriate human rights due diligence on its operations in Myanmar, despite its responsibility under international standards to do so, according to Amnesty’s report.

Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms, and in 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

"Facebook can still be a force for good if it changes its model, reviews its systems"

Civil society made numerous representations to Meta over the course of five years, from 2012 to 2017, warning of the dangers of extreme violence in Myanmar. Yet Meta failed to act. Amnesty has written to Meta several times in 2022 but received no response.

Now, Amnesty International is campaigning for Meta to pay reparations. One way Meta could redeem — to some extent — its reputation and its conscience, if it has one, is to support a US$1 million education project in the refugee camps in Bangladesh. Such funding would help Rohingyas and would represent just 0.002 percent of Meta’s 2021 profits of $46.7 billion. Yet Meta has rejected this request, stating: “Facebook doesn’t directly engage in philanthropic activities.” Well, maybe it should start.

Facebook can still be a force for good if it changes its model, reviews its systems, puts certain principles ahead of profit and stops using algorithms to fire up hatred that fuels crimes against humanity, genocide and other atrocity crimes. Meta should respond to the appeals of Rohingyas and wider civil society.

If it fails to do so, then according to Showkutara, a 22-year-old Rohingya woman and youth activist, “we will go to every court in the world.” Does Meta really want to live with the charge of “accomplice to genocide” rather than “force for good”?

One last thought — I don’t know if Mark Zuckerberg and Meta realize this, but if you add an extra ‘t’ to the company’s name it would be Metta — which in Burmese Buddhist terminology means “loving-kindness.” It would surely be better to live up to that principle than to be an agent of genocide.

*Benedict Rogers is a human rights activist and writer. He is Senior Analyst for East Asia at the international human rights organization CSW, the co-founder and Chief Executive of Hong Kong Watch, co-founder and Deputy Chair of the UK Conservative Party Human Rights Commission, a member of the advisory group of the Inter-Parliamentary Alliance on China (IPAC) and a board member of the Stop Uyghur Genocide Campaign. He is the author of seven books, including three books about Myanmar, especially his latest, “Burma: A Nation at the Crossroads”, and his faith journey is told in his book “From Burma to Rome: A Journey into the Catholic Church” (Gracewing, 2015). His new book, “The China Nexus: Thirty Years In and Around the Chinese Communist Party’s Tyranny”, will be published in October 2022 by Optimum Publishing International. The views expressed in this article are those of the author and do not necessarily reflect the official editorial position of UCA News.
