Meta Platforms has announced a series of additional measures to protect young users from “age-inappropriate” content across its social media platforms.
Meta, which owns popular social networking sites such as Facebook and Instagram, has been facing intense regulatory scrutiny with regard to protective measures for underage users. The recent enforcement of stringent laws, including the Digital Services Act (DSA) in Europe, has placed increased pressure on the tech conglomerate to effectively counter potentially harmful and illegal content and ensure young people’s safety.
In a blog post published Tuesday, Meta says it will start hiding more types of content from teens on Instagram and Facebook, both of which have repeatedly come under regulatory scrutiny for failing to contain various categories of harmful content. Meta says it is placing underage users “into the most restrictive content control settings” by default on Facebook and Instagram and will also restrict additional terms in Instagram’s search function.
The new restrictions primarily concern content with potentially detrimental effects, including material exploring topics such as self-harm, eating disorders, and suicide. Although Meta allows users to share content discussing their own struggles with suicide, self-harm, and eating disorders, its policy is not to recommend such content, and it has therefore been exploring ways to make this content harder to find, Meta says.
“Now, when people search for terms related to suicide, self-harm and eating disorders, we’ll start hiding these related results and will direct them to expert resources for help,” says Meta. “We already hide results for suicide and self harm search terms that inherently break our rules and we’re extending this protection to include more terms.”
The update will be rolled out for Instagram and Facebook users in the coming weeks, the company says. In addition to restricting harmful content, underage users will be sent notifications that will encourage them to update their settings to “a more private experience” on Instagram.
“If teens choose to ‘Turn on recommended settings’, we will automatically change their settings to restrict who can repost their content, tag or mention them, or include their content in Reels Remixes,” Meta says. “We’ll also ensure only their followers can message them and help hide offensive comments.”
Meta has been on the radar of regulators in both the United States and Europe, where its business practices have become the focus of intense debates on child safety and data privacy. In December 2023, Meta was hit with a $600 million lawsuit by the state of New Mexico, which accused it of enabling child sexual abuse on Facebook and Instagram. The firm is also currently being sued by 33 US states for allegedly contributing to a “mental health crisis” through its “addictive features”.