
in DRM Advocacy

From Colonies to Code

by Sadaf Khan
March 24, 2024

“To be oppressed means to be deprived of your ability to choose,” says bell hooks.

In the modern day, no force affects the ability to choose more profoundly than Big Tech. These global platforms, with their all-encompassing access to users’ preferences and their control over the distribution of knowledge and information, raise serious concerns about users’ personal, social, and political choices. Ironically, the same platforms are often used to inform personal choices, promising abundance: abundance of information, abundance of choice. Look closely, however, and a different truth emerges. The algorithms that deliver this abundance have little to do with users’ choices; they are designed not to empower users but to maximise engagement, reinforcing confirmation biases and profiting from cognitive dissonance.

The environments these platforms create appear vast and diverse, but they are carefully crafted traps, confining users within pre-set boundaries of thought and behaviour. Every click, every comment, every emoji that we send feeds algorithms that do not liberate but exploit, shaping what we see and hear to align with profit-driven motives. This illusion of choice is no freedom at all. Beneath the semblance of control, we, the users, remain mere commodities, stripped of meaning and turned into data. In economic terms, our participation on these platforms is free labour that enriches tech conglomerates while leaving users no means of securing reward, transparency, or accountability.

In this cycle of exploitation, it is not just choice that is lost—it is the ability to imagine what true freedom could look like. This is the hallmark of oppression: the quiet transformation of abundance into servitude, the subtle shift from user to labourer, and the seamless normalisation of systems that extract and exploit.

This is not choice; this is control.

And Big Tech has borrowed much from historical systems of control. The world is post-colonial no more (if it ever was): Big Tech’s pervasive influence mirrors the exploitative patterns of historical imperialism and colonial regimes, particularly in its impact on vulnerable communities in the Global South.

Consider one of the most insidious tools of colonisers: erasure.

Colonial regimes thrived on obliterating Indigenous languages, histories, and epistemologies, replacing them with their own imposed systems of knowledge. Erasure was not incidental; it was deliberate, a calculated act of violence against culture and identity. Big Tech platforms follow a similar blueprint. Their global policies—shaped almost entirely through a Western lens—sideline non-Western ways of knowing and being. Take any platform’s community guidelines, for example. These guidelines impose homogeneous standards on a profoundly diverse world, stripping away cultural nuance. Successful participation is conditional: compliance with these norms is mandatory, effectively perpetuating a new form of digital colonialism. In this virtual empire, local contexts and identities are erased, rendered invisible in favour of dominant narratives. Erasure, once the hallmark of colonial empires, now thrives in the algorithms and policies of the digital age.

Linked to erasure is another tool used by colonisers: cultural assimilation and the imposition of Western values. Colonial powers stripped away local traditions by enforcing European norms in dress, behaviour, and societal roles, breaking communities and asserting control. Today, the same patterns are repeated on digital platforms.

Algorithms on platforms like Instagram often elevate trends that align with Western normative ideals, rewarding creators who fit neatly into the mould they construct. Visibility and engagement come at a price: the surrender of authenticity. Use the trending music, mimic the viral dances, master the science of hashtags; only then does your voice register in a sea of competing voices. Alternative practices of knowledge-sharing and creativity are not just overlooked; they are systematically pushed aside, deemed less worthy, less valid, less deserving of space in this homogenised digital landscape. What emerges is a digital monoculture in which many creators and users find themselves increasingly alienated from their own cultural and communal roots, forced into a narrow framework that mirrors the alienation fostered by colonial systems.

Today, platforms are not merely reflecting global culture; they are actively shaping it, and in doing so, they are guilty of erasing local identities and practices along the way. While some individuals carve out their own digital spaces, this autonomy is frequently afforded by privileges such as education and financial resources. The digital age, far from being the promised realm of liberation, reimagines old patterns of domination and suppression. Platforms serve as instruments enforcing these hierarchies, benefiting the privileged while marginalising the oppressed.

No critique of platforms as a reimagining of the colonial project is complete without addressing economic exploitation, a defining feature of colonialism. Historically, imperial powers extracted natural resources and exploited labour to sustain and expand their economies. Today, Big Tech’s empire operates in much the same way, with resource extraction forming the backbone of its infrastructure. Companies like Google and Meta rely on vast data centres built on servers containing rare earth minerals—materials often mined under exploitative conditions in countries across Africa and Asia. But the exploitation doesn’t stop at physical resources. Platforms thrive on an equally insidious form of extraction: harvesting user data. Creators and users provide free labour through their content and interactions, which the platforms monetise with little to no equitable compensation. This model mirrors colonial systems like the encomienda under the Spanish conquest, where labour was extracted under the pretence of mutual benefit but served only to enrich the colonisers. And so the platforms operate as instruments of control, pulling wealth and power closer to the centre—the Global North—while those on the margins, in the Global South, remain trapped in a structure that exploits their labour yet denies them fair control or reward.

But this harm does not end with the systems of extraction and control; it manifests daily, amplifying inequalities, destabilising social, political, and democratic systems, and endangering women who navigate digital spaces fraught with violence and silencing.

The platforms repeatedly reinforce patterns of exploitation and marginalisation, with those in the Global South bearing the heaviest burden of unchecked power. The evidence is clear: Big Tech’s actions—and calculated inactions—have compounded censorship, disinformation, and harm (https://www.reuters.com/technology/un-chief-tells-consumer-tech-firms-own-harm-your-products-cause-2024-06-24/) to structurally and historically marginalised communities, particularly in politically fragile contexts. These impacts are not anomalies; they are patterns that demand urgent accountability and systemic change.

Looking at just one region, Asia, offers numerous examples of Big Tech’s entanglement in the erosion of political and social movements.

The complicity of these platforms in amplifying hate speech and misinformation is stark in countries like Bangladesh, Myanmar, and India. In Bangladesh, Facebook’s algorithms have not merely reflected societal biases but exacerbated them, amplifying hate speech that has resulted in real-world violence against religious minorities. During the Rohingya crisis in Myanmar, the platform’s role became undeniably criminal, as UN investigators documented Facebook’s involvement in enabling military-led campaigns of incitement and hate speech that fuelled genocide. Meanwhile, in India, platforms have exposed their inconsistencies in content moderation, bending under government pressure and deepening political polarisation and societal divisions.

The myth of Big Tech’s neutrality crumbles when examining its suppression of voices from Palestine and its silencing of those who document and criticise Israel’s human rights violations. No other conflict so clearly illuminates the platforms’ complicity with imperialist and fascist powers. This is not a question of negligence; it is a deliberate alignment with the centres of global power. And this alignment is not confined to the global stage. In authoritarian regimes like Turkey and Egypt, platforms regularly comply with government censorship and data sharing demands, colluding in ways that silence dissidents and endanger those who resist.

These examples, a mere fraction of the multitude, reveal the platforms’ true nature—not as neutral arbiters, but as active participants in systems of oppression.

The question, then, is why?

Why do platforms like Facebook and X enable harm, silence dissent, and deepen oppression? The answer lies in their capitalist core—a system that prioritises profit above all else, no matter the human cost. The Facebook Papers, leaked by whistleblower Frances Haugen, reveal this truth with painful clarity. These documents expose how Facebook knowingly amplified hate speech, disinformation, and divisive content because it drove engagement. In this model, hate, agitation, and violence all become fodder for algorithms that exploit human frailty to swell revenues.

Profit, not people, remained the goal.

As Haugen testified before the Senate, Facebook repeatedly chose its bottom line over safety, particularly in the Global South, where weak regulatory environments made harm easier to ignore. This is capitalism in its most imperialist guise: exploiting the periphery to enrich the centre. Platforms like Facebook are not neutral; they are active agents of a capitalist system that has perfected the tools of imperialism for the digital age. This is not innovation; it is exploitation dressed as progress.

What of the solutions then?

Discussions about platform accountability often focus narrowly on content as the primary driver of harm, neglecting the larger systemic forces at play. Content moderation is undeniably important, but this narrow lens obscures the deeper roots of the issue: the platforms themselves. By neglecting to interrogate the economic structures and design choices underpinning these platforms, this approach never confronts questions of power. Without examining why platforms are allowed to grow into unchecked empires, why their opaque algorithms remain shielded from scrutiny, and why their expanding influence is not recognised as a political force with profound implications for equity and justice, the patterns of harm are unlikely to change.

We know there are no perfect solutions, but perfection isn’t the real challenge. The issue lies in who sets the standards for platform regulation and accountability, and from where. Global standard-setting powers, whether regional entities like the European Union or multilaterals like UNESCO, approach these issues through a predominantly Western lens. That lens creates a deliberate blindness to the realities of the Global South. It assumes that governments always act in good faith, that regulation protects rather than oppresses, and that accountability can exist in systems riddled with authoritarian control. But in the South, where oppressive regimes use the language of regulation to crush freedoms and voices, and dissent in particular, the stakes are entirely different.

Here, the same tools meant to safeguard rights are weaponised to violate them.

A model that does hold some promise for adaptation in the Global South is the Safety by Design framework. At its core, Safety by Design centres user safety and rights in the development and design of products and services, challenging platforms to prioritise protection over profit. For platforms, this would necessitate a fundamental shift in algorithmic focus—one that values people over profit. While there have been notable iterations of this model, such as the one introduced by Australia’s eSafety Commissioner, its application to the Global South requires more than mere replication.

To effectively address the deep inequalities between the Global North and South, we must reimagine Safety by Design principles through a feminist and intersectional lens. This approach acknowledges the diverse harms faced by marginalised communities and integrates their experiences directly into the framework. Design justice, as articulated by scholars like Sasha Costanza-Chock, emphasises the importance of involving affected communities in the design process to ensure equitable outcomes. That is to say, attempts to eradicate the harms created by tech platforms cannot be effective unless those at the margins have the voice and the power to contribute.

To truly respond to the diverse realities of the Global South, platforms must abandon one-size-fits-all approaches to harm and safety. They must embrace contextual sensitivity, crafting tools and policies that resonate with local contexts and ensure that every user feels seen and protected. Recognising the intersecting identities of marginalised communities is crucial; their experiences of harm are multifaceted and cannot be addressed through monolithic frameworks.

And accountability must happen at a structural level: platforms must demystify their algorithms and content moderation practices, tailoring them to serve, rather than undermine, these communities. Without such transparency, platforms risk replicating the very hierarchies they claim to dismantle.

Safety, as bell hooks would remind us, is not just about protection—it is about creating space for growth and transformation. Resisting digital imperialism and colonisation necessitates a transformative shift in how platforms conceptualise and implement safety measures. Harm would have to be seen through lenses that are not just white and Western. The Safety by Design framework, while foundational, must evolve to prioritise the lived experiences of users in the Global South.

This evolution demands a departure from homogenised, global approaches that often overlook local nuances, in favour of strategies deeply informed by the socio-cultural and political contexts of these communities.

