Lawmakers in the European Union (EU) have backed a set of stricter rules that will require Big Tech companies to counter child sexual abuse material (CSAM) on their social media platforms.
The legislation, agreed upon last week, will target leading firms including Alphabet’s Google and Meta, the owner of the social networking platforms Facebook and Instagram as well as the messaging app WhatsApp.
The rules, proposed by the European Commission last year, have divided online safety advocates because they touch on the end-to-end encryption used by some platforms. Critics warn the measures could enable state authorities to conduct unlawful surveillance, undermining data privacy.
The legislation will oblige tech companies to report on how they manage content depicting child sexual abuse, covering both photos and videos. Companies will also have to take more stringent measures against grooming on their platforms.
These reports will be shared with police through the EU Centre on Child Sexual Abuse, which will act as a consultative body on cases of child abuse and exploitation. To address surveillance concerns, lawmakers have stipulated that detection orders, used to identify and take down exploitative content, may only be issued by judicial authorities on a temporary basis.
Regulatory scrutiny of Big Tech has intensified in the EU since the Digital Services Act (DSA) came into force, making online child safety a top priority for tech firms. Violators could face bans and fines of up to six per cent of global turnover.
Regulators have also stepped up pressure on Big Tech in the US. Last week, the CEOs of leading companies, including Snapchat owner Snap Inc., Discord, and X (formerly Twitter), were subpoenaed to appear before the Senate Judiciary Committee to testify about their “failures to protect children online”. The CEOs of Meta and TikTok are expected to testify voluntarily. The hearing is scheduled for December 6, 2023.
Leading social media companies, including Meta, TikTok, X, and Google, have drawn the attention of privacy watchdogs in both Europe and the US. In October, TikTok and Snapchat, among other online services, signed a pledge to tackle child sexual abuse material created with generative artificial intelligence (AI). The pledge followed a warning from a nonprofit that supports child abuse victims that AI could significantly accelerate violence against children on the internet.