Meta Platforms has apologised for reels containing disturbing content after a large number of users flagged them this week, according to reports by various international publications, including CNBC.
Several users noticed an unusual rise in content depicting graphic violence and sexual imagery in the “Reels” section of their Instagram accounts. Subsequently, parent company Meta was flooded with complaints about the insensitive content from users, who reported that graphic content continued to show up in their feeds despite being flagged.
According to CNBC, the inundation of unsettling content, which violated Meta’s own content guidelines, was the result of an “error”. In a statement carried by the publication, a Meta spokesperson said, “We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake.”
Users voiced their concerns regarding the unexpected influx of disturbing content across several social media platforms, including Facebook and X, where they questioned whether they were the only ones experiencing an unexplained change in their content recommendations.
Meta, in its “Violent and Graphic Content” policy, states that in order to protect its users, the company removes “content that is particularly violent or graphic, such as videos depicting dismemberment, visible innards or charred bodies”. Sadistic comments on images of human or animal suffering are also subject to removal, it adds.