Meta is launching a new tool called Take It Down that will enable teenagers to prevent the malicious spread of their private media and allow users to report child sexual exploitation on Facebook and Instagram.
The new tool, financed by Meta, was announced on Monday by the National Center for Missing & Exploited Children (NCMEC), a child protection nonprofit. It is aimed at curbing cybercrimes involving the publication of an individual’s explicit photos and videos on social media without their consent, commonly known as “revenge porn”. The crime has surged dramatically over the past few years, especially among young people, according to NCMEC.
“Having explicit content online can be scary and very traumatizing, especially for young people,” said Gavin Portnoy, communications vice president at NCMEC. “The adage of ‘you can’t take back what is already out there’ is something we want to change. The past does not define the future and help is available.”
The program’s initial infrastructure was financed by Meta and had a low-profile launch at the end of December 2022. As of Monday, Take It Down had recorded more than 200 cases.
Take It Down allows minors to report their private pictures to Facebook or Instagram anonymously by generating a hash, a kind of digital fingerprint, of the images on their own devices. The photos or videos being reported therefore never have to be uploaded to another website. TakeItDown.NCMEC.org offers software that can be installed on the user’s device to generate a hash for an explicit image; what is stored in a Meta database is not the image but that anonymised number. If the image is later published to Facebook or Instagram, the database flags the match and the platform reviews and removes the content.
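To illustrate the idea of on-device fingerprinting, here is a minimal sketch that reduces a file to a hash locally. It assumes a generic cryptographic hash (SHA-256) purely for illustration; the actual Take It Down client may use a different hashing scheme, and the function name below is hypothetical.

```python
import hashlib

# Minimal sketch of on-device fingerprinting, assuming a cryptographic
# hash such as SHA-256. The real Take It Down client may use a different
# hashing scheme; this is illustrative only.
def fingerprint_image(path: str) -> str:
    """Return a hex digest of the file without uploading it anywhere."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large photos or videos need not fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Only this fingerprint, never the image itself, would be submitted.
    print(fingerprint_image("example.jpg"))
```

Because only the fingerprint is submitted, the explicit image or video never has to leave the reporter’s device; the platform compares the fingerprints of newly uploaded content against the stored hashes to find matches.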
“We created this system because many children are facing these desperate situations,” said NCMEC CEO Michelle DeLaune. “Our hope is that children become aware of this service, and they feel a sense of relief that tools exist to help take the images down. NCMEC is here to help.”
“This issue has been incredibly important to Meta for a very, very long time because the damage done is quite severe in the context of teens or adults,” said Antigone Davis, director for global safety at Meta. “It can do damage to their reputation and familial relationships, and puts them in a very vulnerable position. It’s important that we find tools like this to help them regain control of what can be a very difficult and devastating situation.”
Take It Down works for all unencrypted photos across Facebook, Instagram, Messenger and direct messages. The tool can also be used by parents or guardians on behalf of a minor. Facebook and Instagram users can also submit a report to help Meta remove explicit media depicting subjects under the age of 18.
In 2021, Meta launched a similar initiative called StopNCII (Stop Non-Consensual Intimate Image Abuse) to curb the vindictive distribution of adults’ private media.