Apple and Google have rules against “nudify” apps, but their app store search and advertising systems actually point users to them, a TTP investigation found.
Searches for terms like “nudify,” “undress,” and “deepnude” in the app stores produced multiple apps capable of digitally stripping the clothes off women in photos.
These apps can take images of real people and use AI to make them look naked, put them into pornographic videos, or turn them into sexually explicit chatbots.
Apple and Google ran ads for nudify apps in some of the search results, and the app stores even suggested additional nudify search terms through their autocomplete function.
The apps identified by TTP have been downloaded a combined 483 million times and have generated more than $122 million in lifetime revenue, according to data compiled by a mobile analytics firm.
The investigation found 31 nudify apps rated as suitable for minors, a notable finding given the growing number of sexual deepfake scandals in schools.