Meta is suing a Hong Kong-based company for running ads on Facebook and Instagram to promote an app that creates non-consensual nude images using artificial intelligence. The lawsuit, filed on Thursday, June 12, targets Joy Timeline HK Limited, the developer of CrushAI, a so-called “nudify app.”
What is CrushAI?
CrushAI allows users to upload a photo of someone and create nude or intimate images using AI technology.
Meta filed the suit in Hong Kong in an effort to block the company from advertising on its platforms.
Why is Meta taking legal action?
According to Meta, Joy Timeline repeatedly violated its ad rules, circumventing the platform’s review process even after previous ads were removed.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta said in a blog post. “We’ll continue to take the necessary steps — which could include legal action — against those who abuse our platforms like this.”
Prior warnings and political pressure
Researchers and lawmakers have long warned about the rise of “nudify apps,” which are readily available online, in app stores, and on social media advertising platforms.
In February, Sen. Dick Durbin, D-Ill., wrote a letter to Meta CEO Mark Zuckerberg requesting that his company crack down on CrushAI, citing research showing that more than 8,000 CrushAI-linked ads appeared on Meta platforms in just two weeks.
Meta’s new enforcement tools
Meta also announced stronger “enforcement methods,” including AI-based detection tools to flag inappropriate ads, even those without explicit content. It will also use pattern-matching tech to catch “copycat” ads and tactics borrowed from counter-disinformation efforts to dismantle ad networks promoting these apps.
Meta said it’s coordinating with outside and in-house “specialist teams” to monitor nudify apps as they “evolve their tactics to avoid detection.”
The company also plans to share data with other tech firms to help shut down these services across the digital ecosystem.
The broader ‘nudify’ problem
Meta’s action comes amid a surge in the use of nudify apps.
A 2024 report by Bellingcat’s Kolina Koltai linked the app ClothOff to over 9.4 million users in one quarter. It was also associated with high-profile cases of AI-generated child sexual abuse, including one incident at a U.S. school.
Research shows ClothOff was accessed by over 235,000 users through social media in just three months. Some of its promotional accounts on X, formerly Twitter, had hundreds of thousands of followers, though one premium account drew minimal engagement, a sign it was likely used primarily for advertising.
A Tech Transparency Project report found X failed to remove any reported deepfakes flagged as nonconsensual sexual content. The project also revealed that ClothOff recently moved to a new domain, as tracked by researchers on Bluesky.
A double standard?
Just one day before Meta’s lawsuit, AI Forensics released a report alleging that Meta applies different standards when reviewing ads than when reviewing organic content. Researchers say they uploaded screenshots of flagged ads that had previously run on Meta and found the same content was immediately removed when posted organically.
The report accuses Meta of maintaining a “systemic double standard” and misleading EU regulators about its ad review systems in filings required by the Digital Services Act.
Author: Craig Nigrelli