Apple and Google have rules against “nudify” apps, but their app store search and advertising systems actually point users to them, a TTP investigation found.
Searches for terms like “nudify,” “undress,” and “deepnude” in the app stores produced multiple apps capable of digitally stripping the clothes off women in photos.
These apps can take images of real people and use AI to make them look naked, put them into pornographic videos, or turn them into sexually explicit chatbots.
Apple and Google ran ads for nudify apps in some of the search results, and the app stores even suggested additional nudify search terms through their autocomplete function.
The apps identified by TTP have been downloaded 483 million times and made more than $122 million in lifetime revenue, according to data compiled by a mobile analytics firm.
The investigation found 31 nudify apps that were rated suitable for minors, a notable finding given the growing number of sexual deepfake scandals in schools.
This report shares OONI data on the blocking of Telegram in Russia.
The European Union on Monday imposed sanctions on two China-based companies and one Iranian company for cyberattacks against EU member states.
The EU listed China-based Integrity Technology Group and Anxun Information Technology, and Iranian company Emennet Pasargad.
Encyclopedia Britannica and its Merriam-Webster subsidiary have sued OpenAI in Manhattan federal court for allegedly misusing their reference materials to train its artificial intelligence models.
Britannica said in the complaint filed on Friday that Microsoft-backed OpenAI used its online articles and encyclopedia and dictionary entries to teach its flagship chatbot ChatGPT to respond to human prompts, and "cannibalized" Britannica's web traffic with AI-generated summaries of its content.