X, formerly known as Twitter, has reinstated 6,103 accounts in Australia that had been banned for violating community guidelines, in the period since Elon Musk’s October 2022 takeover of the platform.
In a transparency report released by Australia’s eSafety Commissioner on Thursday, X disclosed the deep cuts made to the safety and public policy teams at the firm following Musk’s high-profile $44 billion acquisition.
The information was sought by the watchdog under the country’s Online Safety Act, which requires tech companies to explain how they manage hateful content on their social networking platforms.
The eSafety office sent X a legal notice in June 2023 requiring the company to explain its approach to online hate. The disclosures in the transparency report echo persistent concerns from digital rights advocates and tech critics over Musk’s erratic and controversial decisions, which have led to radical changes at the company.
X’s global trust and safety staff has been reduced by a third, with an 80 per cent cut in the number of safety engineers, according to the report. The number of content moderators directly employed by X has been slashed by more than half, and global public policy staff has been reduced by nearly 80 per cent, the report adds.
The platform has reinstated over 6,100 previously banned accounts, of which 194 had been banned for serious breaches of rules related to online hate. The report raises concerns over X’s decision to reinstate these accounts without placing them under additional scrutiny, given their record of prior violations.
“It’s almost inevitable that any social media platform will become more toxic and less safe for users if you combine significant reductions to safety and local public policy personnel with thousands of account reinstatements of previously banned users,” said Australia’s eSafety Commissioner Julie Inman Grant.
“You’re really creating a bit of a perfect storm. A number of these reinstated users were previously banned for online hate.”
If X allows the “worst offenders” back while lacking sufficient moderation and wellbeing staff, there are clear concerns about the implications for user safety, the commissioner added.