Meta says it has filed a lawsuit against a Hong Kong-based company behind ‘nudify’ apps, as part of its broader crackdown on AI tools that let users generate fake sexually explicit images of individuals without their consent.
The tech giant announced it has sued CrushAI app developer Joy Timeline HK Limited to prevent the company from advertising its products on Meta’s platforms.
“This follows multiple attempts by Joy Timeline HK Limited to circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules,” Meta said in a blog post published last week.
The Facebook parent also said it has developed a new AI system capable of finding, detecting, and taking down ads for nudify apps and websites on its platforms more quickly.
“We’ve worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads,” it said. “We’ve also applied the tactics we use to disrupt networks of coordinated inauthentic activity to find and remove networks of accounts operating these ads,” Meta added.
The move comes amid a surge in AI-powered ‘nudify’ apps on the internet. These apps use generative AI to turn fully clothed images of victims into realistic nude images. Reports have previously suggested that students learn about these nudify apps or websites through ads on Instagram and other social media platforms.
In addition, Meta’s announcement comes nearly a month after US President Donald Trump signed the landmark Take It Down Act into law. The new legislation makes it illegal to share explicit images of individuals without their consent. Revenge porn as well as fake AI-generated sexual images also fall under the scope of the new act.
Meta said that over the past six months, its ‘expert teams’ have taken down four separate networks of accounts that sought to run ads promoting nudify apps on its platforms.
It also revealed that the bad actors behind these apps would evolve their tactics to avoid detection. “For example, some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block,” Meta said.
The company further re-emphasised that its updated policies do not allow the promotion of nudify apps or similar services on its platforms. It has also restricted search terms such as ‘nudify’, ‘undress’, and ‘delete clothing’ on Facebook and Instagram so that they return no results.
Meta said it will start sharing information like URLs in order to enable other platforms to remove nudify-related content as well. So far, it has provided more than 3,800 URLs to tech companies that are part of the Tech Coalition’s Lantern programme. This is in addition to the signals about violating child safety activity, including sextortion, that are already shared by Meta.
Finally, the company said it will continue to support legislation requiring app stores such as the Google Play Store and Apple App Store to verify a user’s age and, if the user is underage, obtain parental consent before the app can be downloaded.
Such legislation has intensified the clash between app store operators (like Google and Apple) and major social media platforms (such as Meta, X, and Snap) over who is responsible for the online safety of young users. Both Utah and Texas have adopted similar legislation that puts the burden of responsibility on app stores.