Digital Rights Monitor

in DRM Exclusive, News

Google, Twitter, TikTok ordered to explain handling of online child abuse

By Usman Shahid
February 23, 2023

Photo: Getty

Google, Twitter and TikTok have received legal notices from Australian regulatory authorities ordering them to submit information on their efforts to contain child exploitation material.

The notices were issued on Wednesday by Australia’s e-Safety Commissioner Julie Inman Grant. Google, Twitter and TikTok will have to answer questions about their handling of online child abuse and blackmail. Twitch and Discord have also been served the notices.

The platforms have 35 days to respond. Companies that fail to do so face fines of up to $700,000 a day.

“We’ve been asking a number of these platforms for literally years: what are you doing to proactively detect and remove child sexual abuse material?” said Inman Grant. “And we’ve gotten what I would describe, as, you know, not quite radical transparency.”

The move draws particular attention to Twitter’s anti-exploitation policies following billionaire Elon Musk’s $44 billion takeover of the company in October 2022. Musk had pledged to put an end to child exploitation on Twitter, but those claims were called into question by the massive layoffs that followed, which became the subject of intense media scrutiny and expert criticism.

The commissioner remarked that Musk’s Twitter would now have an opportunity to reveal its efforts to counter online child abuse. Inman Grant expressed concern over the handling of harmful material on Twitter, given that several safety teams responsible for protecting children had already been laid off.

“Back in November, Twitter boss Elon Musk tweeted that addressing child exploitation was priority No 1 but we have not seen detail on how Twitter is delivering on that commitment,” she said. “We’ve also seen extensive job cuts to key trust and safety personnel across the company – the very people whose job it is to protect children – and we want to know how Twitter will tackle this problem going forward.”

The platforms will have to explain their detection mechanisms for child exploitation material, including during live streams. They must also clarify how algorithms could amplify the reach of illegal material as well as their responses to extortion attempts against children.

“The creation, dissemination and viewing of online child sexual abuse inflicts incalculable trauma and ruins lives. It is also illegal,” the commissioner said. “It is vital that tech companies take all the steps they reasonably can to remove this material from their platforms and services.”

In response to the notice, Samantha Yorke, Google’s senior manager for government affairs and public policy, said that child sexual abuse has no place on Google’s platforms.

“We utilise a range of industry-standard scanning techniques including hash-matching technology and artificial intelligence to identify and remove child sexual abuse material that has been uploaded to our services,” said Yorke.

TikTok stated that the platform has a “zero-tolerance approach to predatory behaviour and the dissemination of child sexual abuse material”.

“We have more than 40,000 safety professionals around the world who develop and enforce our policies, and build processes and technologies to detect, remove or restrict violative content at scale.”

Discord, for its part, confirmed that the company would respond to the e-Safety Commissioner’s demand.

“We have zero tolerance for content that threatens child safety online, and firmly believe this type of content does not have a place on our platform or anywhere in society,” a spokesperson said. “This is an area of critical importance for all of us at Discord, and we share the office’s commitment to creating a safe and positive experience online.”

In August 2022, Australia’s e-Safety Commissioner issued similar notices to Apple, Meta and Microsoft.

Tags: Child Abuse, social media

About Digital Rights Monitor

This website reports on digital rights and internet governance issues in Pakistan and collates related resources and publications. The site is a part of Media Matters for Democracy’s Report Digital Rights initiative that aims to improve reporting on digital rights issues through engagement with media outlets and journalists.

About Media Matters for Democracy

Media Matters for Democracy is a Pakistan-based not-for-profit dedicated to independent journalism and to media and digital rights advocacy. Founded by a group of journalists, MMfD works for innovation in media and journalism through the use of technology, research, and advocacy on media- and internet-related issues. MMfD works to ensure that expression and information rights and freedoms are protected in Pakistan.
