Roblox, one of the world’s most popular gaming platforms with over 82 million daily users—many of them under 13—is facing a wave of lawsuits accusing it of failing to protect children from sexual predators.
The lawsuits say Roblox’s parental controls, age restrictions, and safety features repeatedly failed to stop adult predators from contacting and grooming minors on the platform. In many cases, abusers reportedly posed as children, gained victims’ trust on Roblox, and then moved conversations to apps like Discord or Snapchat to escalate the abuse—sometimes leading to devastating real-world consequences.
Suit filed after teen dies by suicide
One of the most shocking cases involves Ethan Dallas, a 15-year-old boy from California who died by suicide in April 2024.
His mother, Rebecca Dallas, filed a lawsuit on September 12, accusing Roblox and Discord of “recklessly and deceptively operating their businesses in a way that led to the sexual exploitation and suicide” of her son.
Ethan reportedly started playing Roblox at age nine with his parents' permission. But according to the lawsuit, at age 12 he was groomed by an adult predator posing as a child. The predator convinced Ethan to disable parental controls and move to Discord, where he began demanding explicit photos and threatening to share them if Ethan didn't comply.
The complaint states that Ethan was left traumatised and, overwhelmed by fear and shame, took his own life. His family accuses both companies of failing to screen users, verify identities, or protect children from known dangers on their platforms.
Suit filed over sexual exploitation of 10-year-old
A lawsuit filed against Roblox on August 15 alleges that a 10-year-old girl from Oakland County, Michigan, was sexually exploited because of inadequate safety measures on the platform, CBS reported. The suit was filed in a district court in California, where the company is based, and calls the platform a "digital and real-life nightmare for kids" despite its marketing as safe for children.
According to the lawsuit, the young girl was targeted by a user posing as a child. The user sent the girl explicit messages and photos of himself. The lawsuit claims Roblox “misrepresented and concealed information about the pervasive predatory conduct its app enables.” It also accuses the platform of allowing sexual content in user-created games, despite marketing itself as safe for children.
The lawsuit also highlights that the platform has allowed users to create scenarios in which their avatars engage in virtual sexual activity, exposing children to such acts.
Suit filed after abduction of 10-year-old girl
In April 2025, a California man was arrested for abducting a 10-year-old child he reportedly met on the apps Roblox and Discord, CNN reported. The arrest came after the child went missing. Her parents subsequently filed suit against the two companies for failing to enforce age verification or stop predators from contacting children, despite public promises of safety.
Families bringing these lawsuits argue that Roblox and Discord knew about the dangers but chose profits over safety. They accuse the companies of wrongful death, fraud, negligence, and misrepresentation, demanding accountability, compensation, and meaningful safety reforms.
These lawsuits highlight growing concerns about child safety on online platforms. Families, advocacy groups, and even the state of Louisiana—which accuses Roblox of “lying about safety and profiting from child exploitation”—want stricter regulations and accountability for tech companies whose platforms attract millions of young users.
Meta’s child safety case
Recently, whistleblowers accused Meta of covering up harm to children on its virtual reality devices and apps. They say the company, which owns Facebook and Instagram and offers a line of VR headsets and games, deleted or doctored internal safety research showing children being exposed to grooming, sexual harassment, and violence in its 3D realms, The Guardian reported.
Cayce Savage, who worked as a user experience researcher at Meta from 2019 to 2023, told a congressional hearing last week that Meta is uninterested in or unwilling to listen to its users.
“When I was doing research to identify the harms that children were facing in VR, which I had to be sneaky about because legal wouldn’t actually let me do it, I identified that Roblox, the app in VR, was being used by coordinated pedophile rings. They set up strip clubs and they pay children to strip, and that Robux (money in Roblox) can be converted into real money. So I flagged this to Meta. I said under no circumstances should we host the app Roblox on their headset. You can now download it in their app store.”
Roblox’s response
Roblox is a game-creation platform that lets users design their own games and play games created by others. It hosts millions of user-created games and virtual worlds across genres, from traditional racing and role-playing games to simulations and obstacle courses. Users can buy, sell, and create virtual items, and the app includes social hangouts and free-form creation experiences where people can text, voice chat, and create together in real time. In response to mounting criticism, Roblox has begun rolling out stricter safety rules:
- Explicit content bans: Any content suggesting sexual activity is now strictly prohibited.
- Age and ID verification: Games with private spaces or adult themes are restricted to ID-verified users aged 17 and older.
- Game moderation: Creators must complete safety questionnaires before publishing games; providing false answers leads to bans.
- AI detection tools: New systems will automatically shut down servers with inappropriate content or repeat violations.
The story has been edited by Yasal Munim, who works as Senior Manager Programs at Media Matters for Democracy.