Apple is finally taking strict action against apps that claim to use AI to generate nude images of people without their consent. The company has removed at least three AI nude image generator apps that offered features such as face swaps and 'undressing', in which AI is used to remove clothing from images of people. These deepfake apps gained popularity through Instagram ads. While Meta was quick to take down the ads redirecting users to download such apps, Apple was slower to respond, acting only after a report by 404 Media.
Apple's App Review Guidelines clearly state that apps promoting pornography or bullying in any form are not allowed. The guidelines read, “Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy. Apps with user-generated content or services that end up being used primarily for pornographic content, Chatroulette-style experiences, objectification of real people (e.g. “hot-or-not” voting), making physical threats, or bullying do not belong on the App Store and may be removed without notice.”
404 Media's investigation uncovered five such ads in Meta's Ad Library, three of which promoted apps available on the App Store. These apps offered features like inserting faces onto nude bodies or digitally removing clothing from photos using AI. The incident highlights ongoing concerns about deepfake-generating AI apps on the App Store.
Recently, Union Minister of State for Electronics and Information Technology Rajeev Chandrasekhar shared his concerns about deepfakes made with AI apps. Deepfakes have increasingly become a major concern, with people falling victim to photos and videos edited or manipulated to make them appear to say or do things they never did. The minister also spoke about the ongoing legal battle between the New York Times and OpenAI over copyright issues.