Meta, the parent company of Facebook and Instagram, was aware of the harassment children faced on its social media platforms but did not do enough to counter the harm, a former Meta employee testified before US lawmakers.
The whistleblower, Arturo Bejar, worked as a director of engineering on the company's Protect and Care team from 2009 to 2015. He returned to the company in 2019 and stayed until 2021. Bejar said he was responsible for safeguarding users' wellbeing and countering online harassment.
The testimony, given to the US Congress this week, focused on content moderation practices at Instagram and dissected the company's approach to managing potentially harmful content affecting teenagers. In 2021, Bejar said, he sent an email to Meta CEO Mark Zuckerberg detailing the risks Instagram's algorithms posed to children.
The email, which was also sent to other top executives, cited internal data showing that about 51 per cent of Instagram users reported having an unpleasant or harmful experience on the platform in the past seven days, and that among underage users aged 13 to 15, 24.4 per cent reported encountering unwanted sexual advances. Bejar's concerns and warnings, however, were never meaningfully addressed.
Bejar’s testimony revolved around the experience of his own daughter, who received unwanted sexual advances on Instagram. The whistleblower said the company failed to take any action after the issue was reported to Instagram. His daughter was 14 years old at the time.
“She and her friends began having awful experiences, including repeated unwanted sexual advances, harassment,” Bejar testified before the US Congress on Tuesday. “She reported these incidents to the company and it did nothing.”
Bejar stressed the need for Meta to revise its policies on children's safety, particularly in cases of harassment, unwanted sexual advances, and other forms of potentially harmful content. He claimed Meta has known all along about the harms its products pose to young people, yet has adopted no meaningful measures to counter and prevent the spread of material undermining children's wellbeing.
“I can safely say that Meta’s executives knew the harm that teenagers were experiencing, that there were things that they could do that are very doable and that they chose not to do them,” Bejar told the press.
He also argued that Instagram should let underage users tell the platform which types of messages they do not want to receive, including those with vulgar or sexually exploitative content.
Meta, for its part, said it is committed to protecting the wellbeing of its users. “Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” the company said in a statement. “All of this work continues.”
Bejar is not the first employee to testify against the tech giant before the US Congress. In 2021, former employee turned whistleblower Frances Haugen made headlines after leaking a trove of internal company documents to the press. Her subsequent testimony to lawmakers also centred on Instagram's failure to protect young users' mental health, among other content moderation issues.