Digital Rights Monitor

Meta, Snap directed to explain child protection measures

by DRM
November 13, 2023

Image: Online

The European Union (EU) has directed Meta, the parent company of Facebook and Instagram, and Snap, which owns Snapchat, to provide details of the measures they have taken to protect children, according to the European Commission.

Both companies must submit comprehensive responses to the EU by December 1 on how they protect children from potentially harmful and illegal content.

The development comes amid heightened scrutiny of social media platforms by state regulators in relation to child protection. Leading platforms, including TikTok and YouTube, have become the focus of child protection watchdogs in both Europe and the United States.

Last week, YouTube and TikTok were sent requests by the EU to submit details regarding their measures to ensure child protection online. The video platforms have until November 30 to respond to the notices.

The regulator has ramped up its scrutiny of tech platforms since Europe’s new set of online rules, the Digital Services Act (DSA), came into effect in August 2023. The law applies to a range of social media platforms, with those having over 45 million monthly active users designated as “very large platforms” under the legislation.

If the Commission is not satisfied with a company’s response, it can launch an investigation into its practices. Under the DSA, platforms found in violation of the rules may face a permanent ban and a fine of up to six per cent of their global turnover.
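
To make the two figures above concrete, the short Python sketch below illustrates the arithmetic, assuming hypothetical user and turnover numbers; only the 45 million monthly-active-user threshold and the six per cent fine cap come from the article.

```python
# Illustrative sketch only: the 45 million monthly-active-user threshold for
# "very large platform" designation and the fine cap of six per cent of global
# turnover are the figures cited above; every other value below (user count,
# turnover) is a hypothetical placeholder, not real company data.

VLOP_THRESHOLD_MAU = 45_000_000   # monthly active users in the EU
MAX_FINE_RATE = 0.06              # up to six per cent of global annual turnover


def is_very_large_platform(monthly_active_users: int) -> bool:
    """True if the platform crosses the DSA's 'very large platform' threshold."""
    return monthly_active_users > VLOP_THRESHOLD_MAU


def max_possible_fine(global_turnover_eur: float) -> float:
    """Upper bound of a DSA fine: six per cent of global annual turnover."""
    return MAX_FINE_RATE * global_turnover_eur


# Hypothetical example figures:
example_mau = 250_000_000
example_turnover_eur = 100_000_000_000.0

print(is_very_large_platform(example_mau))                # True
print(f"{max_possible_fine(example_turnover_eur):,.0f}")  # 6,000,000,000
```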

Meta, in particular, has faced accusations of prioritising business over the safety of children. The revelations that began with the high-profile 2021 testimony of whistleblower Frances Haugen were echoed in a disclosure made by another former Meta employee this month. The whistleblower, Arturo Bejar, testified before US lawmakers that Meta was aware of the harassment and harm facing children on Instagram but did not take meaningful measures to address the problem.

Haugen’s claims, supported by internal company documents, also centred on Instagram’s allegedly detrimental practices and their demonstrated harmful impact on younger audiences; the app is highly popular among users aged 13 to 17.

The internal documents also revealed Meta’s discriminatory content moderation practices, with 87 per cent of the company’s overall budget for classifying misinformation reportedly allocated to the US alone.

Tags: Child Safety, Meta, Snap