Digital Rights Monitor
Top Story

US families are suing Roblox for failing to protect child players from ‘sexual predators’

by DRM
September 17, 2025

Roblox, one of the world’s most popular gaming platforms with over 82 million daily users—many of them under 13—is facing a wave of lawsuits accusing it of failing to protect children from sexual predators.

The lawsuits say Roblox’s parental controls, age restrictions, and safety features repeatedly failed to stop adult predators from contacting and grooming minors on the platform. In many cases, abusers reportedly posed as children, gained victims’ trust on Roblox, and then moved conversations to apps like Discord or Snapchat to escalate the abuse—sometimes leading to devastating real-world consequences.

Suit filed after teen dies by suicide 

One of the most shocking cases involves Ethan Dallas, a 15-year-old boy from California who died by suicide in April 2024.

His mother, Rebecca Dallas, filed a lawsuit on September 12, accusing Roblox and Discord of “recklessly and deceptively operating their businesses in a way that led to the sexual exploitation and suicide” of her son.

It was reported that Ethan had started playing Roblox at the age of nine with the permission of his parents. But according to the lawsuit, at age 12, he was groomed by an adult predator posing as a child. The predator even convinced Ethan to disable parental controls and move to Discord, where he began demanding explicit photos and threatening to share them if Ethan didn’t comply.

The complaint states that Ethan was left traumatised and, overwhelmed by fear and shame, took his own life. His family accuses both companies of failing to screen users, verify identities, or protect children from known dangers on their platforms.

Suit filed over sexual exploitation of 10-year-old

A lawsuit was filed against Roblox on August 15 alleging that a 10-year-old Oakland County, Michigan, girl was subjected to sexual exploitation due to a lack of safety measures by the website, CBS reported. The lawsuit was filed in a District Court in California, where the company is based, alleging that the platform is a “digital and real-life nightmare for kids” despite its advertising as safe for children.

According to the lawsuit, the young girl was targeted by a user posing as a child. The user sent the girl explicit messages and photos of himself. The lawsuit claims Roblox “misrepresented and concealed information about the pervasive predatory conduct its app enables.” It also accuses the platform of allowing sexual content in user-created games, despite marketing itself as safe for children.

It was also highlighted that the platform has allowed users to create scenarios where their “avatars” can engage in virtual sexual activity, exposing children to the acts.

Suit filed after abduction of 10-year-old girl 

In April 2025, a California man was arrested for abducting a 10-year-old child he reportedly met on social media apps Roblox and Discord, CNN reported. The arrest came after the child went missing. Her parents then filed a case against the two apps for failing to enforce age verification or stop predators from contacting children despite public promises of safety.

Families bringing these lawsuits argue that Roblox and Discord knew about the dangers but chose profits over safety. They accuse the companies of wrongful death, fraud, negligence, and misrepresentation, demanding accountability, compensation, and meaningful safety reforms.

These lawsuits highlight growing concerns about child safety on online platforms. Families, advocacy groups, and even the state of Louisiana—which accuses Roblox of “lying about safety and profiting from child exploitation”—want stricter regulations and accountability for tech companies whose platforms attract millions of young users.

Meta’s child safety case 

Recently, Meta whistleblowers came forward with accusations that the company covered up harm to children on its virtual reality devices and apps. They say the social media company, which owns Facebook and Instagram and offers a line of VR headsets and games, deleted or doctored internal safety research that showed children being exposed to grooming, sexual harassment and violence in its 3D realms, The Guardian reported.

Cayce Savage, who worked as a user experience researcher at Meta from 2019 to 2023, told a congressional hearing last week that Meta is uninterested in or unwilling to listen to its users.

“When I was doing research to identify the harms that children were facing in VR which I had to be sneaky about because legal wouldn’t actually let me do it. I identified that Roblox, the app in VR, was being used by coordinated pedophile rings. They set up strip clubs and they pay children to strip and that Robux (money in Roblox) can be converted into real money. So I flagged this to Meta. I said under no circumstances should we host the app Roblox on their headset. You can now download it in their app store.”

Roblox’s response 

It is important to note that Roblox is a game-creation platform that allows users to design their own games as well as play a wide variety of games created by other users. The platform hosts millions of user-created games and virtual worlds covering a broad range of genres, from traditional racing and role-playing games to simulations and obstacle courses. The app allows users to buy, sell and create virtual items. It even includes social hangouts and free-form user creation experiences where users can text, voice chat and create with other people in real time. In response to mounting criticism, Roblox has begun rolling out stricter safety rules:

  • Explicit content bans: Any content suggesting sexual activity is now strictly prohibited.
  • Age and ID verification: Games with private spaces or adult themes are restricted to ID-verified users over 17.
  • Game moderation: Creators must fill out safety questionnaires before publishing games; lying leads to bans.
  • AI detection tools: New systems will automatically shut down servers with inappropriate content or repeat violations.

The story has been edited by Yasal Munim, who works as Senior Manager Programs at Media Matters for Democracy.

Tags: Child Safety, online child safety, Roblox, sexual exploitation

