The European Union (EU) has expressed reservations over the reduced number of content moderators at X, formerly known as Twitter.
The shortage of moderators raises questions regarding X’s ability to comply with the EU’s online regulations.
According to a senior European Commission official, X currently has only 2,294 content moderators. That figure is particularly concerning for a platform that reportedly has more than 520 million monthly active users worldwide.
The number of moderators at the company declined sharply following its takeover by billionaire Elon Musk in October 2022, attracting widespread attention from digital safety experts and regulators. Several crucial teams responsible for addressing disinformation and hate speech were disbanded during mass layoffs.
By comparison, YouTube has 16,974 moderators and TikTok about 6,125, according to reports submitted to the EU in September by their parent companies, Alphabet and ByteDance, respectively. Google’s app store, Google Play, has 7,319 moderators.
Last month, X was fined by Australia’s eSafety Commissioner for failing to cooperate with a probe into the platform’s practices for tackling child abuse. X did not provide details on how it detects, counters and responds to child abuse material.
Since Musk’s $44 billion acquisition of X (known as Twitter at the time), a large volume of research released by civil society organisations and academic institutions has suggested that hate speech has increased drastically on the platform. Musk has, however, repeatedly claimed that hateful content on X has declined under his management.
The regulatory scrutiny around leading social media platforms has significantly risen following the enforcement of the Digital Services Act (DSA) in the EU. The law imposes stricter regulations on tech platforms, demanding more effective measures to tackle potentially harmful and illegal online content.
Under the legislation, sites with more than 45 million monthly active users are designated “very large platforms”. Among the leading platforms the DSA targets are TikTok, Facebook, Instagram, Google, and X.
Earlier this month, the EU sent notices to TikTok and YouTube directing them to explain how they protect children from harmful and illegal content online. Meta and Snap, the owner of Snapchat, have also been asked to explain their child safety measures.
Violations of the DSA can cost a company up to six per cent of its global annual turnover. The law places particular emphasis on the safety of children on social media platforms and aims to make regulation of Big Tech platforms more transparent and effective.