India’s IT ministry has sent notices to YouTube, X (formerly Twitter), and Telegram directing them to ensure the removal of content depicting child sexual abuse. The platforms have been warned of strict action in case of non-compliance.
The government states that if the social media companies fail to comply with the instructions, they may lose their protection from legal liability in the South Asian market. The notices stress the need for swift action against child sexual abuse material and call on the tech companies to ensure such content is removed promptly and permanently from their platforms.
Responding to the Indian government’s notice, Telegram says its moderators actively patrol public parts of the platform and accept user reports in order to remove content that breaches its terms. YouTube’s parent company, Google, says it has a zero-tolerance policy on child sexual abuse material and that no form of such content is acceptable.
“We have heavily invested in the technology and teams to fight child sexual abuse and exploitation online and take swift action to remove it as quickly as possible,” says a YouTube spokesperson. According to the video-sharing platform, YouTube removed more than 94,000 channels and more than 2.5 million videos that violated its child safety policies in the first quarter of 2023.
While YouTube and Telegram have responded to the notices by outlining how they handle harmful and illegal content targeting minors, X has not commented. The platform is already on the radar of regulators and lawmakers for its negligence on several fronts, including disinformation.
The firms have also been directed to prevent the dissemination of child sexual abuse content through robust algorithmic changes and reporting mechanisms. “The rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms,” the notice says, stressing the importance of disabling access to child sexual abuse material.
In the West, newly introduced legislation likewise imposes stricter obligations on tech firms with regard to child safety. The European Union’s Digital Services Act (DSA) obliges internet companies to respond promptly to harmful and illegal content, particularly child sexual abuse material. Platforms that repeatedly violate the rules can be fined up to six per cent of their global turnover and face suspension in the region.