The European Union has fined Meta Platforms €200 million for violating the Digital Markets Act (DMA), finding its “consent or pay” model non-compliant. Under this model, Meta required users either to consent to tracking for personalised advertising or to pay a subscription fee. The European Commission concluded that this binary choice was coercive: it offered no equivalent service that uses less personal data, and so did not allow users to give consent freely, as the DMA requires.
The DMA, introduced to limit the influence of large tech companies designated as “gatekeepers,” aims to ensure fair competition and user autonomy in the digital market. The fine is one of the first major enforcement actions under the new law and is expected to influence global regulatory discussions around platform accountability and user privacy.
Beyond the legal breach, the decision highlights wider concerns about the impact of social media business models. Research, including a 2023 study by the Center for Humane Technology and the University of Oxford, has found that algorithms designed to maximise engagement often prioritise emotionally charged or divisive content. Such engagement-driven ranking has been linked to increased polarisation, misinformation, and mental health harms, particularly among younger users.
In the Global South, governments are beginning to explore similar regulatory frameworks, but implementing them without robust rights protections raises concerns. Among other measures, data localisation laws, which require companies to store data within national borders, are increasingly being adopted in countries such as Pakistan. Critics warn that, without democratic safeguards, such requirements can enable surveillance and restrict online freedoms.
In many Global South countries, where digital regulations are already used to monitor and restrict online spaces, any move toward platform regulation must be approached carefully. Overregulation risks curtailing freedom of expression, while underregulation leaves users exposed to harm.
Online safety also intersects with gender. Women and gender minorities face disproportionate digital harassment and privacy violations, yet content moderation often fails to address these harms effectively, sometimes removing advocacy content while overlooking the abuse itself. As regulation evolves, it will be important to ensure protections are inclusive and sensitive to these disparities.
The EU’s fine on Meta sets a precedent, but its global impact will depend on how other countries interpret and implement similar rules. For developing democracies, the challenge lies in finding a balance: protecting users from platform harms without undermining fundamental digital rights.