Meta is pushing for legislation that would require app store operators, chiefly Google and Apple, to obtain parental approval before users under 16 can download apps.
The proposal would shift a significant share of the responsibility for protecting children from online harm onto app store giants, rather than leaving it with social media platforms alone. It comes as regulatory scrutiny of Meta’s child protection measures mounts.
Meta’s proposal to involve parents in overseeing their children’s app downloads was unveiled by Antigone Davis, the company’s global head of safety. According to Davis, the process would not require parents to provide their own or their child’s potentially sensitive identification to apps with “inconsistent security and privacy practices”.
“Parents should approve their teen’s app downloads, and we support federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps,” Davis said in a blog post. “With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase.”
In this way, parents would be able to decide whether to allow their child to download a given app. They could also verify the child’s age when setting up their phone, avoiding the need to verify it repeatedly across individual apps, the statement adds.
Meta, which owns Facebook and Instagram, has been routinely criticised for failing to take adequate child safety measures on its social media platforms and for allegedly prioritising its business interests over user wellbeing.
The company has also faced intense scrutiny over disclosures made by former employees turned whistleblowers, who highlighted Meta’s alleged disregard for the safety of underage users. Earlier this month, Arturo Bejar, a former director of engineering and wellbeing specialist at Meta, revealed that his concerns about harmful practices at Instagram were never meaningfully addressed.
In 2021, another whistleblower, Frances Haugen, caused a stir by leaking internal company documents to leading news publications. Haugen went on to testify before US lawmakers about Instagram’s potential failure to protect young users’ mental health, among other content moderation issues.
Scrutiny of tech companies’ child safety mechanisms has intensified further with the recent introduction of the Digital Services Act (DSA) in the European Union (EU). The legislation demands stricter controls on potentially harmful and illegal online material, with violations punishable by fines of up to six per cent of a firm’s global turnover.