Meta announced on Monday that Facebook and Instagram are helping to found Take It Down, a new platform designed to prevent explicit images of underage people from spreading online.
According to a press release, Meta and the National Center for Missing and Exploited Children (NCMEC) teamed up to build a program that proactively searches for private images using unique hash codes.
The program allows underage people to flag images of themselves that are being shared on the internet against their will, assigning each image a unique digital code that can help tag it and remove it from the internet.
“Take It Down assigns a unique hash value—a numerical code—to their image or video privately and directly from their own device,” the release said. “Once they submit the hash to NCMEC, companies like ours can use those hashes to find any copies of the image, take them down, and prevent the content from being posted on our apps in the future.”
Meta said the program is built in a way that “respects young people’s privacy and data security” by allowing users to submit only the hash code assigned to their image, rather than the image itself, to NCMEC.
According to NCMEC, the hash code is generated on users’ phones, keeping the image from being uploaded to any central NCMEC database.
Once users submit their hash code, participating companies can use it to find copies of the image, take them down, and prevent them from being shared on their platforms.
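In practical terms, this workflow resembles hash-list matching. The sketch below is purely illustrative and is not NCMEC’s or Meta’s actual implementation, which has not been published; production systems of this kind generally rely on perceptual hashing (such as PhotoDNA or PDQ) so that resized or re-encoded copies still match, whereas this example uses a plain SHA-256 for simplicity. All function names and the local hash list are hypothetical.

```python
# Minimal sketch of hash-based matching, assuming a plain cryptographic hash.
# Real systems typically use perceptual hashes so near-duplicates still match.
import hashlib


def hash_image_on_device(image_path: str) -> str:
    """Compute a hash locally; only this string would ever leave the device."""
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


# Stand-in for the shared hash list that participating platforms can check.
submitted_hashes: set[str] = set()


def submit_hash(image_hash: str) -> None:
    """Record a user-submitted hash (the image itself is never uploaded)."""
    submitted_hashes.add(image_hash)


def should_block_upload(candidate_image_bytes: bytes) -> bool:
    """A platform checks a new upload against the hash list before posting."""
    candidate_hash = hashlib.sha256(candidate_image_bytes).hexdigest()
    return candidate_hash in submitted_hashes


if __name__ == "__main__":
    # Hypothetical demo: a user hashes a photo locally and submits only the
    # hash; the platform later blocks an upload whose bytes match that hash.
    with open("example.jpg", "wb") as f:
        f.write(b"\xff\xd8\xff example image bytes")
    submit_hash(hash_image_on_device("example.jpg"))
    with open("example.jpg", "rb") as f:
        print(should_block_upload(f.read()))  # True: upload would be blocked
```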
The company said it is integrating the platform into Facebook and Instagram so that people can easily access it when reporting potentially violating content on those apps.
Take It Down builds on Meta’s earlier tool, Stop NCII, which operates under a similar concept for adults. That program launched in 2021 with more than 70 non-profits to curb “revenge porn” online.
The company did not respond to a request for comment from the Daily Dot.