The UK has urged Meta to halt its rollout of end-to-end encryption on Instagram and Messenger, citing concerns about child abuse conducted through online messaging.
A new government-led campaign suggests that encrypting messages would enable child abusers to “hide in the dark”. It calls on Meta to build robust safety mechanisms that can detect child abuse activity within messages before rolling out end-to-end encryption, a feature that secures messages so that no third party can access data exchanged between sender and receiver. Encrypted messaging is one of the key features of WhatsApp, which is also owned by Meta.
The campaign, launched by the UK’s Home Office on Wednesday, urges Meta CEO Mark Zuckerberg to halt end-to-end encryption for Instagram and Messenger, which have roughly 35 million and 28 million active users across Britain, respectively. The campaign arrives a day after the Online Safety Bill passed Parliament. The legislation focuses on safety measures for children on online platforms and imposes stricter regulations on internet companies to combat harmful content.
“Meta has failed to provide assurances that they will keep their platforms safe from sickening abusers,” said Britain’s Home Secretary Suella Braverman. “They must develop appropriate safeguards to sit alongside their plans for end-to-end encryption.” Braverman added that she would not compromise on child safety. The primary concern raised by the government and various child safety advocates is that abusers could use encrypted messaging to get away with grooming and sexually abusing minors.
Meta, on the other hand, has maintained that encryption helps keep users safe from malicious actors. “We don’t think people want us reading their private messages, so we have spent the last five years developing robust safety measures to prevent, detect and combat abuse while maintaining online security.” Meta restricts users over 19 from messaging underage users who do not follow them, and says it has invested more than $20 billion in safety and security across its platforms since 2016.
The Online Safety Bill has undergone significant revisions since it was proposed four years ago and now focuses heavily on child safety and illegal content. The legislation has attracted equal parts approval and criticism, the latter chiefly from tech companies, which argue that the bill threatens the right to free speech and expression on social media platforms. Meta, too, objected to the stringent regulations, saying they would undermine its end-to-end encryption on WhatsApp. Under the Online Safety Bill, the UK’s media regulator Ofcom can impose financial penalties of up to $23 million on tech companies that fail to rapidly take down harmful or illegal content. Violations could also cost these firms 10 per cent of their annual global revenue.