On October 12, 2021, The Intercept published a list of over 4,000 organisations and individuals that Facebook (now Meta) blacklists on its platform, a list curated under its Dangerous Individuals and Organisations (DIO) policy. The list had, until this point, been secret; Facebook had refused to share it publicly, claiming that doing so could endanger employees and enable blacklisted entities to circumvent the policy. According to The Intercept, the list includes "politicians, writers, charities, hospitals, hundreds of music acts, and long-dead historical figures."
The company states in its Dangerous Individuals and Organisations policy: "We remove praise, substantive support and representation of various dangerous organisations." Under this policy, not only does affiliation with the blacklisted groups and individuals warrant a block on the platform; glorifying, legitimising or aligning one's ideology with these groups also leads to removal from Facebook platforms.
The leaked list, which contains proscribed entities from around the world, features over 100 names specific to Pakistan. Yet in Pakistan these organisations and individuals operate freely on Facebook: according to our own independent research, there are dozens of examples of terrorist organisations such as Sipah-e-Sahaba Pakistan (SSP) and Lashkar-e-Taiba (LeT) maintaining a presence on the platform.
The continued active presence of these two groups on the platform demonstrates that the list, and by extension the policy, is poorly implemented. Facebook has a longstanding problem moderating content in languages other than English, which allows a great deal of hate speech to remain on the website. But the scope of the issue became apparent when blacklisted names such as SSP and LeT appeared not only in Urdu or Roman Urdu but also in English. Hundreds of accounts, pages and groups endorsed, claimed affiliation with and, in some cases, claimed to officially represent these and many other listed terrorist organisations on social media. The failure to moderate this content, even when there is an established policy and set of guidelines, creates a space where terrorist individuals, organisations and their radical ideologies are applauded and eulogised.
This kind of support for extremist organisations has worrying consequences for both politics and the day-to-day lives of ordinary people. Haroon Baloch, who researches hate speech at Bytes4All Pakistan, notes that the prevalence of extremist content in online spaces is growing fast in Pakistan. "The sources of hate speech online, which have resulted in incitement of violence against religious minorities, have often been linked to dangerous and banned outfits. These online activities directly hurt religious minorities by positioning motivations of hatred in a religious justification." Every year since 2013, his organisation has recorded an exponential increase in online hate speech against religious and sexual minorities. In Pakistan's case, this means that individuals and organisations that propagate a hateful and violent interpretation of Sunni Islam, hostile towards religious minorities such as Shias and Hindus, have accumulated a massive following on Facebook.
Journalist Zoya Anwer Naqvi, a member of the Shia community, notes that Shias are specifically targeted and harassed online by coordinated networks belonging to these dangerous organisations, and that in some cases this violence spills over into the offline sphere. "What do you do when people who have been harassing you online show up at your doorstep?" asks Naqvi, emphasising the feeling of helplessness shared by targeted communities in Pakistan. Conspiracy theories and hoaxes meant to instil fear and hatred of minorities are commonly shared by the same extremist sources, creating an unsafe online sphere. Growing censorship of progressive counter-narratives to religious extremism in the country has only made things worse.
In 2016, American sociologist Monica Lee authored an internal investigation report for Facebook which found that the platform's recommendation features, especially the newsfeed algorithm, were significantly worsening the problem of extremism and hate speech in Germany, even though such content violates Facebook's community standards. The report highlighted that "64 [percent] of all extremist group joins are due to [Facebook's own] recommendation tools."
More recently, leaked internal company memos revealed striking negligence and apathy towards the platform's harmful consequences. Internal documents leaked by whistleblower Frances Haugen state: "We estimate that we may [take action on] as little as 3-5 [percent] of hate and ~0.6 [percent] of V&I [Violence and Incitement] on Facebook."
Discussing the effects of hateful trends in Pakistan, Haroon Baloch explains how major sources of hate speech against religious minorities can be linked to dangerous organisations on Facebook's DIO list, such as the Tehreek-e-Labbaik Pakistan (TLP), that are also outlawed in the country. Because Facebook's business model is built on maximising profits through ads, its algorithm suggests content it predicts will generate the most engagement. "The algorithm helps coordinate offline events of hate speech and violence," says Baloch. As a result, emotionally charged and extremist content tends to dominate, since this is the kind of content that users' interests align with and that keeps them on the platform the longest. "In order to truly understand online hate speech on Facebook, we must understand the working relationship between big internet companies like Facebook and governments," Haroon suggests.
Breeding hate in Pakistan through the platform
The so-called dangerous organisations identified in Facebook's DIO policy have long operated offline, but with increased digitisation and an influx of their audiences onto social media, these groups have reworked their outreach strategies and found ways to connect with potential recruits and sympathisers on the internet. Facebook, the platform most commonly used by their audiences, has become home to many such organisations and individuals looking to spread radical ideologies and to target and threaten religious and gender non-conforming minorities online. "I know of several instances where people who have posted about Shia activities have been mass reported by coordinated online networks operated by these dangerous organisations," remarked Zoya Naqvi. "And this has resulted in taking down the Shia accounts on Facebook."
One such organisation is Tehreek-e-Labbaik Pakistan (TLP), which is responsible for many violent protests in the country, including one on November 25, 2017 that led to the killing of multiple law enforcement officials across the country. The TLP created a group called "TLP social media team" on July 3, 2020, and shortly afterwards shared a link to the group through WhatsApp. The writer joined this group and monitored its content until its last activity in March 2021; the group has been inactive since. Of the over 200 members, only a handful of group admins were allowed to post, and everyone else was expected simply to pick up templates for Twitter trends, calls for mobilisation, or general ideological content to forward. These videos more often than not contained hateful remarks against Ahmadis and carefully constructed fake news that merged various sections of Pakistani politics to rally support against religious minorities.
A screenshot of one of the messages shared on the "TLP Muslim media group" on WhatsApp, claiming that an Ahmadi man was appointed Permanent Representative of Pakistan to the UN by the PTI government.
A senior member of the volunteer-led cyber team of Jamaatud-Dawa (JuD), the political wing of Lashkar-e-Taiba (LeT), was reported to have said that the team, which consists of camerapersons, editors and IT technicians, spans over 45 cities and relies mostly on the voluntary effort of young students. The team has also organised social media workshops in major cities across the country, mobilising tech-savvy supporters of the movement and ushering them into its fold. This JuD cyber team may still be active on Facebook: we observed for this article that JuD and its associated charities were operating under their own names on the website as recently as 2021.
Facebook profile endorsing Hafiz Saeed, the co-founder of Lashkar-e-Taiba, under his name.
With access to the blacklist, we set out to check if it worked
To check how effective the DIO list is, we started by isolating the individuals and organisations from Pakistan blacklisted by Facebook. We then recorded whether each individual or organisation was present on Facebook, defining "presence" as content that praises, supports, or represents a dangerous group or individual. Facebook's own guidelines for identifying disallowed content follow a similar rubric, and go into further depth about how to recognise praise, support and representation. The comparison was designed to investigate whether Facebook is effectively implementing its own rules, rather than measuring the platform against an external yardstick.
Surprisingly, a majority of these dangerous individuals and organisations, 64 out of the 124 specific to Pakistan, have a presence on the social network. This shows a troubling disconnect between Facebook's professed commitment to regulating dangerous speech and its failure, in practice, to restrict terrorist organisations and individuals from maintaining a presence on its website.
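The audit described above boils down to a simple tally over a list of entity records. A minimal sketch in Python, under stated assumptions: the `Entity` record, its fields and the sample data below are illustrative inventions for this article's methodology, not Facebook's actual data or list format.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    # One record per blacklisted entity; fields are illustrative.
    name: str
    country: str
    # Languages in which praise, support or representation was found
    # ("en", "ur"); an empty set means no presence was detected.
    presence: set = field(default_factory=set)

def audit(entities, country="PK"):
    """Tally how many blacklisted entities from one country
    were found to have a presence on the platform."""
    local = [e for e in entities if e.country == country]
    present = [e for e in local if e.presence]
    return len(local), len(present)

# Hypothetical sample records, not real survey data.
sample = [
    Entity("Org A", "PK", {"en", "ur"}),
    Entity("Org B", "PK"),
    Entity("Org C", "PK", {"ur"}),
    Entity("Org D", "AF", {"en"}),
]

total, present = audit(sample)
print(f"{present} of {total} listed entities have a presence")
# → 2 of 3 listed entities have a presence
```

The same structure supports the language breakdown reported later in this piece: filtering `presence` for entities that contain only `"en"` versus both `"en"` and `"ur"`.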
Facebook kept the list secret and disputed its accuracy after The Intercept published it, so it is not certain how often, if at all, the list is updated. Likewise, the criteria for adding names to the list remain unknown, owing to the secrecy Facebook has maintained around the DIO list.
At times, individuals and organisations designated as terrorists by, or facing sanctions from, the United States government are included despite never having had a presence on Facebook, since US law requires this. For instance, Haji Bashir and Zar Jamil Hawala, a money changer from the border town of Chaman, Balochistan, was placed on global sanctions lists in 2015 for connections with the Taliban. The money changer is unlikely to have ever had a presence on social media, since it never appeared in any searches on the platform or in any posts, but it is on Facebook's dangerous organisations list. The purpose of listing entities that have never had a presence on Facebook is to block them from ever establishing one.
Tehreek-e-Labbaik Pakistan (TLP) and its notable leaders such as Pir Afzal Qadri, Khadim Hussain Rizvi and Saad Hussain Rizvi make appearances on the blacklist but continue to have Facebook pages dedicated to them, and accounts that routinely glorify them through sharing their speeches.
Screengrab from a video of Pir Afzal Qadri shared by a Facebook account.
Several pages in Khadim Hussain Rizvi’s name on Facebook.
Several pages with Saad Hussain Rizvi’s name and images on Facebook.
Surprisingly, Facebook's inability to limit the representation and praise of dangerous individuals and organisations in English was just as severe as its inability to do so in Urdu. Of the 44 organisations we surveyed from the list, 26 were found to be present on Facebook; of these, only 2 maintained a presence in English alone, while the remaining 24 maintained one in both English and Urdu. This shows that the lack of content moderation under the DIO policy, supposedly a more sensitive list than Facebook's general content moderation policy, is not restricted to languages like Urdu: English-language DIO content is also routinely ignored.
Support, and particularly representation, of these organisations is present in both languages, though it is unclear whether praise and representation of dangerous groups and individuals was more consistent in Urdu than in English. For example, Milli Muslim League (MML) is a political off-shoot of Jamatud Dawa, formed in 2017 by Hafiz Saeed to continue similar political activities under a different name, right after Tehreek-e-Labbaik Pakistan (TLP) became active. MML and its leaders Hafiz Saeed and Arif Qasmani, the chief coordinator of LeT, were allegedly involved in the 2008 Mumbai attacks, the 2006 Mumbai train blasts and other acts of terrorism in India.
Given its active involvement in terrorist events and activities, the MML is blacklisted under Facebook's DIO policy along with Lashkar-e-Taiba (LeT), of which Jamaatud Dawa is the political wing. Present members of MML, including Saifullah Khalid, Muhammad Ehsan, Muhammad Harris Dar and Muhammad Hussein Gill, are all blacklisted on Facebook for connections with terrorism. While these organisations, whose attacks have targeted India, face widespread condemnation in English, they maintain a significant support base in Urdu on Facebook.
Regardless, Milli Muslim League (MML) has Facebook pages bearing the party's full name in both Urdu and English. The page, which has over 2,000 likes, has been inactive since 2018, but its continued presence on the platform indicates that Facebook's DIO policy enforcement is unreliable. When these pages were active, they frequently posted content promising to bring in Shariah law and to engage the "enemies of Islam" in physical combat.
Milli Muslim League (MML) Facebook pages in Urdu and English still exist despite being identified in the platform’s DIO list.
Other names from the list include Asma Money Exchangers and Sipah-e-Sahaba Pakistan. Asma Money Exchange was sanctioned by the United States in 2014; not only does it have a Facebook page in its name, but the page also lists a Lahore address for the shop that matches the address on the US sanctions list. Sipah-e-Sahaba, meanwhile, was declared a terrorist organisation in Pakistan in 2002 under the Anti-Terrorism Act of 1997, and in the same year was renamed the Millat-e-Islamia Party (MIP), which still has its own Facebook page in English. In 2003, after being banned as MIP, the party changed its name to Ahle Sunnat Wal Jamaat, which also has its own Facebook page in English.
While carrying out this research, we found that 18 organisations had no presence on Facebook, neither as accounts and pages nor in posts endorsing them. These include organisations on US terror sanctions lists that logistically supported terrorism financing under the guise of commerce, such as Rahat Limited, Haji Khairullah Haji Sattar Money Exchange and Haji Basir and Zarjmil Company Hawala; these money exchangers constitute the non-public-facing, logistical side of terrorist operations. They have been listed by the UN for involvement in terrorism financing and for planning and facilitating financial transactions for the Taliban in Afghanistan.
One proscribed organisation we did not find under its own name is Al Akhtar Trust, a charity formed in the year 2000 and used to fund Lashkar-e-Taiba, Lashkar-e-Jhangvi and Harakat-ul-Mujahideen, whose Facebook group remains active today. In 2007, Al-Akhtar Trust changed its name to Pakistan Relief Foundation, whose Facebook page, with over 11,000 followers, maintains an active social media presence.
Other organisations that did not appear in Facebook searches were either those operating long before Facebook was created or those whose operations largely existed outside of Pakistan. Owing to occasional collaboration with homegrown terrorists, however, they have been listed as Pakistani organisations.
Among the 80 individuals from Pakistan blacklisted in Facebook's DIO list, there was a clear distinction between public-facing figures (the ulema, the leaders, the Islamic scholars) who engaged with people directly and regularly to sustain grassroots networks, and those involved in activities such as low-level command, fighting and logistics. Many of the people we did not find on Facebook belonged to the latter category.
A fan page dedicated to Al Qaeda leader Abdul Aziz Nuristani, plainly titled "شیخ عبدالعزیز نورستانی" ["Sheikh Abdul Aziz Nuristani" in Urdu script], has not been taken down despite existing since 2016. Since Abdul Aziz Nuristani's name is on Facebook's blacklist, the page's survival over the past six years raises questions about the list's effectiveness. One could argue that it is, in fact, a failure by Facebook to implement its hate speech policies in this part of the world; it could also reflect an inability or disinterest on Facebook's part to enforce limitations on these dangerous individuals and groups in languages other than English.
The absence of certain individuals did not reflect any particular vigilance on Facebook's part; those missing were simply the ones seldom reported in the news for their involvement in terrorist activities. We found equal numbers of public-facing and non-public-facing individuals linked to terrorist groups, which suggests that public visibility, rather than any deliberate effort by Facebook, dictates which blacklisted individuals and organisations are absent from the platform.
Support and representation for terrorists is commonplace. Is Facebook inept or negligent?
Dangerous individuals and organisations associated with terrorist activities being allowed to use Facebook freely has, predictably, enabled them to acquire support, in the form of both financial and human resources, on social media. For example, Falah-e-Insaniat Foundation, the charity wing connected to Lashkar-e-Taiba and blacklisted under Facebook's DIO policy, is represented in both Urdu and English on the social network through a public group bearing the same name in English. In October 2021, an account named after Hafiz Saeed, the co-founder of Lashkar-e-Taiba, posted a video captioned "Falah-e-Insaniat Foundation" in Urdu advertising the foundation's work and asking followers for donations and support; the video continues to exist on the platform.
Falah-e-Insaniat Foundation Facebook group.
Allowing the activities of such groups on a platform that reaches millions can have devastating consequences in a country like Pakistan, where violent right-wing groups have claimed thousands of lives and are constantly looking for avenues to disseminate their ideologies and recruit new members. Figures from Pakistan's sectarian far-right politics feature prominently on the list, which testifies to how social media platforms like Facebook enable effective communication between these groups and the public at large.
However, another question these lists raise, particularly after the takeover of the Afghan government by the Tehreek-e-Taliban, a proscribed organisation blacklisted under the DIO policy, is how social media companies would strike a balance between political and extremist speech when a sitting government or a government official appears on the list. In Pakistan's case, Muavia Tariq, a member of Punjab's provincial assembly and the son of one of the co-founders of Sipah-e-Sahaba Pakistan, is designated as a dangerous individual in Facebook's list for his connections with the terrorist organisation. Adding a sitting member of a government to a list of dangerous individuals raises content moderation dilemmas that are not new to Facebook; they were seen and debated when former US President Donald Trump was banned from the platform. The company, including its CEO Mark Zuckerberg, has long maintained a hands-off approach towards the speech of politicians on its platforms, terming it "newsworthy" and in the public interest, something that should be heard and seen by everyone and hence not moderated. However, in a 2019 blogpost, Nick Clegg, the VP of Global Affairs and Communication at Meta, said, "Content that has the potential to incite violence, for example, may pose a safety risk that outweighs the public interest value."
Owing to the volatile political landscape in this part of the world, various inclusions in DIO list pose a unique question: Would Facebook silence elected and government officials on its platform?
In the case of Muavia Tariq, however, he continues to maintain a definite presence on Facebook in both English and Urdu despite his inclusion in the DIO list. Countless fan pages, fan videos and sermons glorifying his name (matching the spelling on Facebook's list) in their titles have not been taken down. Alongside the question of whether Facebook should silence a government official sits the question of whether the DIO list is effectively implemented at all.
These examples are testament to Facebook's contradictory policy implementation when it comes to limiting dangerous speech, individuals and organisations. There are now over 46 million Facebook users in Pakistan who are potentially being exposed to terrorist-linked organisations. The same organisations are mobilising on Facebook and often engage in inciting speech online that can, and does, translate into real-life violence. When such a platform is introduced into a deeply divided society like Pakistan, it acts as a crowbar wrenched into already deepening cracks, prying them apart even further.
Content moderation and the Facebook-State nexus
On the relationship between the state and the platform in Pakistan, and on Facebook's content moderation policies, we spoke to Farieha Aziz, director of Bolo Bhi, a digital rights advocacy group. "The state works with Facebook to deplatform banned organisations and individuals, however in Pakistan, anti-terrorism laws and hate speech laws are used to silence dissident voices rather than regulating hate speech." Commenting on the inadequacy of current content moderation mechanisms, she added, "The government is also pressurising the platform which wants to curtail hate speech, resulting in over regulation and over monitoring. So in the case of Burhan Wani, several people reported that their posts were disappearing. Just mentioning someone should not warrant removal from the platform, this erasure doesn't take care of the problem, you are only shutting down discourse."
The Prevention of Electronic Crimes Act (PECA), introduced in 2016, contains sections 10 and 11, pertaining to cyber terrorism and hate speech, which have repeatedly been misused to silence voices. Recently, the Islamabad High Court has also taken notice of this abuse of the law by the Federal Investigation Agency (FIA). Companies like Facebook must not only follow the legal requirements of maintaining a presence in a country like Pakistan but also navigate the regional socio-political dynamics and laws that apply. This kind of compliance can produce a relationship that leads to an unsafe cyberspace for religious minorities and gender non-conforming minorities in Pakistan. Aziz says, "You could report something but you need prompt action and review and that doesn't happen easily given the insufficient understanding of local language and context. There has been a conversation with Facebook that there needs to be more local language moderators but the response is really slow and in comparison not many human resources have been dedicated to the task."
By 2018, after a decade-long War on Terror, terrorist attacks had claimed more than 65,000 lives in Pakistan since the turn of the century. It is important to keep a close eye on Facebook in Pakistan's case, given the shift of deeply embedded radical networks towards the internet to mobilise and strategise, and given Facebook's role around the world as a tool for violence. The government's varied relationships with these banned outfits have also been instrumental in shaping the cyberspace that exists today. As pre-internet methods of organising violence, such as Friday sermons and neighbourhood mosque meetings, have come under greater state scrutiny, organisers have migrated online. The question is not whether Facebook is being used to facilitate and promote real-life violence; it unquestionably is, as evidenced by well-documented cases from India, Myanmar, the United States and many other countries. The question is what Facebook is doing to limit the reach of dangerous individuals, organisations and speech on its platform that have caused real-life violence in the world. So far, the answer remains: not enough.