Inspired violence
In the second week of October 2021, Durga Puja, the most important celebration in the Bengali Hindu calendar, was in full swing across Bangladesh.
Right in the middle of the festival — which continues for a week and is meticulously planned by the Hindu community — a Facebook post appeared showing the Quran, the holy book of Muslims, at the feet of a Hindu deity at a temple in Comilla, 84 kilometers southeast of the capital Dhaka.
The provenance of the post was murky at best. But the damage was done.
The incident sparked a series of violent protests across the country. Initial reports suggested that around 500 people gathered in Comilla and adjoining areas and attacked Hindu temples and shrines after the post went viral.
The violence, among the worst against the Hindu minority in recent years, lasted for several days. More than 80 makeshift shrines, called mandaps, were vandalized; at least seven people were killed, including two Hindus, and 150 were wounded. Police used teargas and, in one incident, even opened fire on protesters.
Later, police identified a Muslim man with a history of drug addiction and petty crime as the person behind the act.
Sparked by what was evidently a fake Facebook post, the Durga Puja violence encapsulates a dangerous trend that has taken hold in Bangladesh, where social media platforms are used to instigate crowds to attack vulnerable communities.
The country was again rocked by a spate of religious violence in the second week of July this year (2022), when a livid Muslim mob ransacked several Hindu homes and at least one local temple after a Hindu college student uploaded a Facebook post “belittling Islam.”
The violence occurred in Narail, a rural district in the southwestern part of the country, 150 km from the capital Dhaka. Several Hindu homes were destroyed, and dozens of residents were injured. According to local media, the police did not arrest any perpetrators but registered a case against the student for hurting religious sentiments.
Religion, smartphones, hatred
The government of Bangladesh is grappling with this new menace without much success, as recurrent bouts of violence continue. The template is almost invariably the same: a social media platform, usually Facebook, is used to amplify hate content and help attackers consolidate strength on the ground.
Bangladesh, a Muslim-majority country of 164 million, was carved onto the world map in December 1971 following what is widely acknowledged as one of the genocides of the 20th century. Religious identity has always been a sensitive trigger for conflict between communities.
But the proliferation of smartphones, inexpensive mobile data services, and Wi-Fi connections added a new dimension to the nature and propagation of hatred that made rabble-rousing more convenient and deadlier.
Atheists, Hindus, who represent about 10 per cent of the population, and the Ahmadiyya, a minority religious group labeled ‘heretics’ by mainstream Muslims, have often been targeted by Muslim fundamentalists in the country. These attacks have led to deaths, physical injuries, property loss, and exile.
Facebook is the country’s most popular social media platform, accounting for 93 per cent of social media use in Bangladesh. The second most widely used platform is YouTube, with a mere 3.39 per cent share nationwide.
Facebook has time and again been the main vessel for disseminating hateful content that inflames sensitive issues and helps organized groups spew hate and turn violent. For almost a decade, controversial posts, often uploaded from fake accounts, have slowly led to real-world violence. The platform is the one consistent element in most registered cases involving such attacks.
One of the earliest cases of Facebook-amplified violence to garner international headlines came in October 2012, when a group of Muslims in the country attacked the Buddhist community after a picture of a Buddhist boy burning the Quran surfaced on Facebook.
The attacks occurred in the Cox’s Bazar region, where a sizable Buddhist minority lives. Buddhists comprise less than one percent of Bangladesh’s population; they are based in Cox’s Bazar and adjoining areas about 400 kilometers southeast of Dhaka. Hundreds of Buddhist homes were destroyed, and at least eight temples were ransacked. The violence continued for several days, and the local administration had to call paramilitary forces to get the situation under control.
Such cases of communal violence have become depressingly familiar in Bangladesh: incidents are often sparked on Facebook and carried onto the streets by angry mobs.
Facebook’s perennial hate problem
This is not the first time Facebook has been accused of riling up crowds with hate speech against vulnerable communities. The most telling set of events, which ultimately led to the social media giant being sued for providing unbridled space for hate, was the Rohingya genocide of 2017, in which the Myanmar army killed more than 10,000 Rohingya.
The death campaign directly impacted Bangladesh, which shares a border with Myanmar. Almost 750,000 Rohingya refugees entered Bangladesh within a few weeks, fleeing the violence.
In lawsuits filed in 2021, Rohingya refugees in the UK and US sued Meta, the parent company of Facebook, for “amplifying hate speech against the Rohingya people” and failing “to take timely action” against such posts. The United Nations has also accused the social media site of being slow and ineffective in reacting to the hate campaign in Myanmar. Facebook has admitted that it created an “enabling environment” that contributed to the massacre of the Rohingya people.
Sophie Zhang, a former Facebook employee, made the most damning revelations about the tech giant’s lack of initiative in confronting how its platform affects the global socio-political environment. Zhang worked as a data scientist in the company’s ‘integrity’ division, which was established to fight what Facebook calls “inauthentic behavior.” She was fired in 2020 for poor performance. On her way out, Zhang wrote a damning 7,800-word memo indicting the tech giant with concrete case studies of how governments around the world blatantly used Facebook to sway opinion before key elections.
The memo and subsequent interviews paint a stark picture of Facebook’s inner workings vis-à-vis the platform’s response to unethical manipulation in various countries.
In an interview with the Guardian, Zhang concluded: “Facebook doesn’t have a strong incentive to deal with this, except the fear that someone might leak it and make a big fuss, which is what I’m doing… The whole point of the inauthentic activity is not to be found. You can’t fix something unless you know that it exists.”
A company spokesperson, Liz Bourgeois, told the Guardian that the company fundamentally disagreed with Zhang’s “characterization of our priorities and efforts to root out abuse on our platform.”
“We aggressively go after abuse worldwide and have specialized teams focused on this work. As a result, we’ve taken down more than 100 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in countries around the world… Combatting coordinated inauthentic behavior is our priority.”
But the problem is embedded in the very culture of connectivity Facebook offers its users. In the Rohingya hearings, it emerged that Facebook had devoted hardly any resources to moderating content in languages other than English.
With a global reach of billions of people, Facebook has not invested enough in moderation across the languages its users speak, a failure that has cost lives worldwide.
Bangladesh fails to tackle the hate
Bangladesh has been struggling to curb the spread of hate-filled content on Facebook for some time now. In 2017, the cyber unit of Bangladesh’s Counter-Terrorism and Transnational Crime (CTTC) division identified 2,500 Facebook groups actively spreading hate against various communities in the country and sought Facebook’s help to block them.
The biggest challenge is the proliferation of fake Facebook accounts behind such content. Facebook did block some of the pages the country’s law-enforcement agencies pointed out, but the numbers were dismally low in the early years.
Bangladesh’s requests to Facebook for user data have grown steadily over the past seven years. In the first half of 2020, Bangladesh sought information on 241 user accounts, and Facebook complied with 44 per cent of the requests.
But curbing the spread of hate content by sharing user information with the government is a fraught approach. As rights groups warn, Bangladesh is, by most accounts, a one-party state known for hounding critics and dissenters.
According to Odhikar, a rights group, at least 605 citizens were picked up by law-enforcement agencies between 2009, when the present ruling party, the Awami League led by Sheikh Hasina, took power, and September 2021. Most of the victims of enforced disappearance are political activists from opposition parties.
Given this track record, which led Human Rights Watch (HRW) to call in a report for sanctions against law-enforcement agencies involved in “disappearing” citizens with impunity, sharing user data with the government can lead to arrests, intimidation, and torture.
Bangladesh introduced the Digital Security Act in 2018, which allows law enforcers to detain suspects without a warrant; hundreds of political and rights activists have been arrested since its introduction.
Amnesty International says the law “imposes dangerous restrictions on freedom of expression”. Sharing user information with a government bent on squashing opponents can be counterproductive to the goal of curbing hate speech on Facebook.
Formal Facebook
For many in Bangladesh, the internet essentially means Facebook. The platform’s reach in the country is so pervasive that, for most users, Facebook is the length and breadth of the internet.
In 2021, the number of users increased by 10 million owing to the surge in internet activity during the Covid lockdowns. Men account for 69.1 per cent of users and women for 30.9 per cent, and almost 50 per cent are between 18 and 24 years old.
Facebook’s Messenger witnessed exponential growth during the same period. In April 2020, 9.3 million people in the country used Messenger; within a year, the number had grown four and a half times, to 42.1 million.
Facebook also plays a huge role in boosting outreach for small businesses in the country. According to a 2018 survey, more than 300,000 small businesses operate on Facebook. In a country where 55 per cent of the population is self-employed, Facebook is a massive equalizer in access to business opportunities and growth.
In an attempt to generate revenue from big tech platforms like Google and Facebook, the Bangladeshi government drafted a policy to impose a 15 per cent VAT on all sales proceeds from the country. The draft also insists that tech giants store user data in local facilities and share account details with the government on request.
The draft law also lays out a code of conduct for social media platforms operating in the country. Among its measures: blocking content and users that violate the code must be carried out within 72 hours of notification by the government.
Some of the draft’s wording raises concerns about tightening restrictions on free expression. It instructs platforms to inform users that they cannot host, display, publish, or share any information that threatens the unity, integrity, defence, security, or sovereignty of Bangladesh, friendly relations with foreign states, or public order, or that incites the commission of any cognizable offence, prevents the investigation of any offence, or insults another nation.
When a local newspaper reached out to Facebook for comment on the proposed law, Facebook said it was still reviewing the draft and hoped that any new rules for the internet in “Bangladesh will respect international best practices on safety, privacy, and freedom of expression, and create an environment conducive to innovation, investment, and growth.”
Bangladesh has long insisted that Facebook establish a formal presence in the country and open an office to deal with the various issues it faces; the tech giant currently operates in the country through regional offices. A local office would naturally mean more resources deployed to Bangladesh for pressing problems like the instigation of communal violence. But despite giving assurances to the Bangladeshi government in various meetings held in Singapore, Facebook reiterated in 2019 that it had no plans to open a country office in Bangladesh.
Facebook’s reluctance to set up an office in the country can be attributed to the proposed law: if registered as a local company, Facebook would have to pay the taxes it mandates. However, in September 2021, in a video conference with the Postal and Telecommunications minister, Facebook appointed Sabhanaz Rashid Diya, a Bengali-speaking employee of the company, as its officer for ‘Bangladesh affairs’ for “speedy resolution of any existing problem regarding content,” as the news report put it.
Conclusion
For many in Bangladesh, the internet is Facebook: the ecosystem the platform offers constitutes the length and breadth of their digital experience. Dealing with a country like Bangladesh is never easy, especially when the government is known for abusing its powers against citizens. Sharing user data with that government can push ordinary citizens into harm’s way.
Given its enormous global reach, Facebook now wields power over the lives of billions of people. The best course for Facebook in dealing with the spread of hate-filled content in Bangladesh is to ensure better monitoring systems are in place and to use AI tools to develop processes that can identify groups and accounts likely to post offensive material on the platform. As the record shows, Facebook has been used as a vessel to manipulate poll results and embolden groups to attack vulnerable communities worldwide.
Facebook has to invest more resources in developing teams that understand its users’ many languages and ensure that posts that spread hate and violence are curbed at any cost.
If Facebook does not treat this with the utmost priority, then Sophie Zhang’s charge that the company has “blood on its hands” will only ring truer.
This report is part of DRM’s exclusive journalism series exploring Big Tech’s failure to contain hate speech and lack of corporate accountability across Asia.