Digital Rights Monitor

UK to use AI age checks to block children from adult websites

By DRM
December 7, 2023

The UK’s telecoms regulator, the Office of Communications (Ofcom), has rolled out draft guidance for what it calls “highly effective age checks” to prevent individuals under the age of 18 from accessing porn websites.

The new age verification methods, which incorporate artificial intelligence (AI) among other mechanisms, will place a significant portion of responsibility on adult websites to ensure their content is not accessed by minors.

The development arrives under the recently passed Online Safety Act, a highly contentious piece of legislation that set off a prolonged debate on online privacy among digital safety advocates, tech companies, and the UK government. 

The law requires digital platforms to implement stricter measures to protect children from online harm, including bullying and content promoting self-harm and eating disorders.

Ofcom says the “age assurance” processes, involving age verification, age estimation, or a combination of both, are highly effective at assessing whether or not a viewer on a given adult website is a child.

Where AI-based technology is incorporated into child safety measures for explicit content, websites will have to set up robust age gates on their landing pages. To verify their age, a viewer will have to upload a selfie to the website, and artificial intelligence will then estimate their age based on facial features.
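
A minimal sketch of how such a selfie-based age gate might work is given below. It assumes a facial age-estimation model that returns an age and a confidence score; the function names, thresholds, and margins are illustrative and are not taken from Ofcom’s guidance.

```python
# Hypothetical sketch of a selfie-based age gate. The age-estimation model,
# threshold, and safety margin below are assumptions for illustration only.

from dataclasses import dataclass

MINIMUM_AGE = 18          # legal threshold discussed in the Online Safety Act context
SAFETY_MARGIN_YEARS = 5   # assumed buffer, since facial age estimation is approximate


@dataclass
class AgeEstimate:
    estimated_age: float   # model's predicted age in years
    confidence: float      # model's self-reported confidence, 0.0 to 1.0


def estimate_age_from_selfie(selfie_bytes: bytes) -> AgeEstimate:
    """Placeholder for an age-estimation model or third-party provider."""
    raise NotImplementedError("integrate an age-estimation provider here")


def passes_age_gate(selfie_bytes: bytes) -> bool:
    """Grant access only when the estimated age clears the threshold with margin.

    Anything uncertain is treated as a failed check, so the user would fall back
    to another verification route (ID document, bank, credit card, etc.).
    """
    estimate = estimate_age_from_selfie(selfie_bytes)
    if estimate.confidence < 0.9:
        return False
    return estimate.estimated_age >= MINIMUM_AGE + SAFETY_MARGIN_YEARS
```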

Other verification methods include uploading an image of an ID document, such as a passport or driver’s licence, and sharing banking information that can confirm a user’s age.

These age checks are not as simple as they sound, however, and the approach to restricting children’s access to porn websites is being deemed controversial. For instance, a viewer might have their bank confirm their age to the platform (with their consent), or they may upload their ID and then provide an additional live photo that is matched against it to confirm their identity.

Users will also have the option to verify their age through their credit cards, with the issuing bank confirming that the cardholder is over 18 years of age. Verification can be provided through mobile network operators and digital wallets as well.
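
The sketch below illustrates how a site might route users between these verification options while handling only a yes/no “over 18” result rather than the underlying documents. Every method name and integration shown is hypothetical and is not drawn from the guidance.

```python
# Illustrative dispatcher over the verification routes described above.
# Each checker is assumed to return nothing more than a boolean "over 18"
# signal, so the site never stores the ID image, bank record, or card details.

from typing import Callable, Dict

# A checker wraps a hypothetical integration with a bank, card issuer,
# mobile network operator, digital wallet, or ID-matching provider.
AgeChecker = Callable[[dict], bool]


def verify_age(method: str, payload: dict, checkers: Dict[str, AgeChecker]) -> bool:
    """Run the user's chosen verification route and return only over-18 or not."""
    checker = checkers.get(method)
    if checker is None:
        # Self-declaration and similar "weaker" checks are deliberately absent.
        raise ValueError(f"unsupported verification method: {method}")
    return checker(payload)


# Example wiring (all checker functions hypothetical):
# checkers = {
#     "facial_estimation": check_selfie,
#     "id_document": check_id_with_live_photo,
#     "credit_card": check_card_issuer,
#     "mobile_operator": check_mno,
#     "digital_wallet": check_wallet,
# }
# allowed = verify_age("credit_card", {"token": "..."}, checkers)
```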

The regulator’s draft guidance will not allow adult websites to continue with “weaker” age checks, including self-declaration of age and warnings or disclaimers. Adult content must not be visible to a viewer before or during the age-checking process, the regulator says.

Privacy and security concerns

Ofcom says all the assurance methods outlined above will be subject to the UK’s privacy laws, but the approach itself raises a slew of questions with regard to data protection and other persistent online challenges. The primary concerns revolve around the potential exposure of sensitive information, as has been observed in various hacking episodes where critical banking data ended up on malicious sites.

Moreover, experts argue that uploading images to websites that are notorious for publishing and profiting from child sexual abuse will entail further risks for users. The risks become particularly alarming in the context of image manipulation fuelled by generative artificial intelligence (AI).

Questions are also being raised about the mechanisms by which the government will ensure the protection of personal information provided to adult websites for age verification. How regulators will make sure this information is not misused by bad actors for malicious purposes, particularly blackmail, is another major concern.

Open Rights Group (ORG), a digital rights group in the UK, has expressed reservations regarding Ofcom’s proposed guidance. In a statement, ORG said: “Open Rights Group agrees that it is important that children are protected online; however, Ofcom’s proposed guidelines create serious risks to everyone’s privacy and security.”

Tags: Child Safety