Tech companies could face substantial fines for failure to remove content encouraging self-harm on their social networking platforms, according to a new plan by the UK government.
A proposed amendment to the UK’s Online Safety Bill seeks to criminalise the encouragement of self-harm on social media, classifying such material as illegal. The change follows the case of 14-year-old Molly Russell, who took her own life in November 2017 after viewing online content related to self-harm and suicide.
The teenager had consumed vast amounts of damaging material on online platforms including Instagram and Pinterest, the inquest into her death found. Russell’s death became the focus of intense debate about tech corporations and their influence on young users when the coroner concluded that she “died from an act of self-harm while suffering from depression and the negative effects of online content”. The landmark conclusion was followed by calls for stricter action against tech corporations that allow harmful content to be openly viewed by anyone across their platforms.
The Online Safety Bill will be updated by the UK’s Culture Secretary, Michelle Donelan.
“I am determined that the abhorrent trolls encouraging the young and vulnerable to self-harm are brought to justice,” said Donelan. “So I am strengthening our online safety laws to make sure these vile acts are stamped out and the perpetrators face jail.”
Under the proposed amendment, social media platforms such as Facebook, Twitter and Instagram will be required to block self-harm material or face penalties. For violations, Ofcom, the UK’s communications regulator, will be able to impose a fine of up to 10 per cent of a company’s annual global turnover; by that formula, a platform turning over £10 billion a year could face a fine of up to £1 billion.
“Social media firms can no longer remain silent bystanders,” Donelan said. “They’ll face fines for allowing this abusive and destructive behaviour to continue on their platforms under our laws.”
The bill, first introduced in Parliament in 2021, has stalled twice amid concerns that it would curtail the right to free speech on online platforms. Much of the criticism was directed at a provision concerning “harmful but legal” material.
“From the evidence submitted to Molly Russell’s inquest in September, the ‘harmful but legal’ content probably did the most damage to Molly’s mental health,” said a spokesperson for the Molly Rose Foundation, a charity set up in Molly’s memory. Referring to one of the posts seen by Molly and cited in the inquest, they added: “Would this new offence prevent posts such as ‘Who would love a suicidal girl?’, or would these continue to be spread by social media platforms? It’s therefore important that other ‘harmful but legal’ content, of the type we know was harmful to Molly, is also within scope of the bill.”
According to reports, in the six months before her death Molly interacted with 16,300 pieces of content on Instagram, of which about 2,100 (roughly one in eight) concerned self-harm and suicide. She is also said to have received recommendation emails from Pinterest bearing titles such as “10 depression pins you might like”.
Leading social media platforms such as Facebook, Instagram and YouTube remain rife with material depicting self-harm, despite repeated calls for effective regulation of online content. The problem is exacerbated by the recommendation algorithms of social platforms and search engines, which surface content according to a user’s inferred preferences and past engagement, a feedback loop sketched in the illustrative example below.

The unwillingness or failure of tech companies to manage potentially harmful content poses serious threats in developed and developing markets alike. The difference is that developing countries, particularly those in the Global South, lack the market power and political leverage to hold platforms accountable. There, the absence of regulation, combined with the state’s failure to grasp the complexities and evolving trends of technology, results in exclusion from the global digital community, setting back technology-fuelled progress and driving a surge in various forms of abuse and harm.
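The feedback loop works, in outline, like this: content similar to what a user has engaged with before is scored higher, shown more often, and therefore engaged with again. The toy Python sketch below is purely illustrative (the post catalogue, topics and scoring are invented for this example; no platform’s real ranking system is this simple), but it captures why a user who has dwelt on one theme keeps being shown more of it.

```python
# Toy sketch of engagement-based ranking -- entirely hypothetical,
# not any platform's actual code.

from collections import Counter

# Hypothetical catalogue of (post_id, topic) pairs.
POSTS = [
    ("p1", "sports"), ("p2", "music"), ("p3", "depression"),
    ("p4", "travel"), ("p5", "depression"), ("p6", "cooking"),
]

def recommend(engagements: Counter, k: int = 3) -> list[str]:
    """Rank posts by how often the user engaged with each post's topic."""
    ranked = sorted(POSTS, key=lambda post: engagements[post[1]], reverse=True)
    return [post_id for post_id, _ in ranked[:k]]

# A user whose history is dominated by one theme...
history = Counter({"depression": 12, "music": 2})

# ...is served mostly that theme; each new interaction raises its
# score further on the next pass, closing the feedback loop.
print(recommend(history))  # -> ['p3', 'p5', 'p2']
```

Real systems rank with learned engagement predictions rather than simple counts, but the structural point is the same: optimising for predicted engagement, with no penalty for harm, concentrates a vulnerable user’s feed around exactly the material that harms them.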
The following report by Media Matters for Democracy (MMfD) analyses the impact of Big Tech’s failure to ensure user wellbeing in the Pakistani context, showing how leading tech corporations neglect their South Asian markets and thereby create a distinct set of challenges and threats for young audiences. Read: Social Media in a Mental Health Dismissive Society.