The European Union (EU) has directed Meta, the parent company of Facebook and Instagram, and Snap, which owns Snapchat, to provide details of the measures they have taken to protect children, according to the European Commission.
Both companies must submit comprehensive responses to the EU by December 1 on how they protect children from potentially harmful and illegal content.
The development comes amid heightened scrutiny of social media platforms by state regulators in relation to child protection. Leading platforms, including TikTok and YouTube, have become the focus of child protection watchdogs in both Europe and the United States.
Last week, the EU sent similar requests to YouTube and TikTok for details of their measures to ensure child protection online. The video platforms have until November 30 to respond to the notices.
The regulator has ramped up its scrutiny of tech platforms since Europe’s new set of online content rules, the Digital Services Act, came into effect in August 2023. The law applies to a range of social media platforms, with those having over 45 million monthly active users designated as “very large online platforms” under the legislation.
If the Commission is not satisfied with a given company’s response, it can launch an investigation into its practices. Under the DSA, tech platforms found in violation of the rules may face fines of up to six per cent of their global turnover and, in serious cases, a ban from operating in the EU.
Meta, in particular, has faced accusations of prioritising business over children’s safety. The revelations that began with the high-profile 2021 testimony of whistleblower Frances Haugen were echoed this month in disclosures by another former Meta employee, Arturo Bejar, who testified before US lawmakers that Meta was aware of the harassment and harm facing children on Instagram but took no meaningful measures to address the problem.
Haugen’s claims, backed by internal company documents, likewise centred on Instagram and on practices that allegedly have a harmful impact on the app’s younger audiences, among whom it is highly popular, particularly users aged between 13 and 17.
The internal documents also revealed disparities in Meta’s content moderation practices, with 87 per cent of the company’s overall budget for classifying misinformation reportedly allocated to the US alone.