Meta cracks down on nudify apps after investigation exposes ads

Meta is suing a company that advertised generative AI apps on its social media platforms that enable users to “nudify” people without their consent. The lawsuit against Joy Timeline comes after hundreds of ads for the digital undressing apps were discovered on Meta’s Facebook, Messenger, Instagram, and Threads platforms by a CBS News investigation published last week.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta said. “We’ll continue to take the necessary steps – which could include legal action – against those who abuse our platforms like this.”
Meta’s lawsuit specifically aims to stop Hong Kong-based Joy Timeline from running ads for its CrushAI nudify apps across Meta’s social media platforms, after the company made “multiple attempts… to circumvent Meta’s ad review process.”
The legal action comes on the heels of the recently published CBS News investigation that found hundreds of ads for nudify apps across Meta’s platforms. Meta told CBS at the time that it had since removed a number of these ads, deleted the accounts running them, and blocked the URLs associated with the nude deepfake apps, but said it’s becoming harder to enforce its policies as exploitative generative AI apps find new ways to evade detection. CBS’s report said that ads for AI deepfake nude tools could still be found on Instagram even after Meta removed the ads flagged by the investigation.