DeepNude app to 'undress' women shut down after furor
The application allowed users to virtually "undress" women using artificial intelligence
WASHINGTON:
The creators of an application allowing users to virtually "undress" women using artificial intelligence have shut it down after a social media uproar over its potential for abuse.
The creators of "DeepNude" said the software was launched several months ago for "entertainment" and that they "greatly underestimated" demand for the app.
"We never thought it would be viral and (that) we would not be able to control the traffic," the DeepNude creators, who listed their location as Estonia, said on Twitter.
"Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way."
Articles in The Washington Post, Vice and other media showed how the app could take a photo of a clothed woman and transform it into a nude image, sparking outrage and renewed debate over nonconsensual pornography.
"This is a horrifically destructive invention and we hope to see you soon suffer consequences for your actions," tweeted the Cyber Civil Rights Initiative, a group that seeks protection against non-consensual and "revenge" porn.
Mary Anne Franks, a law professor and president of the CCRI, tweeted later, "It's good that it's been shut down, but this reasoning makes no sense. The app's INTENDED USE was to indulge the predatory and grotesque sexual fantasies of pathetic men."
DeepNude offered a free version of the application as well as a paid version, and was the latest in a trend of "deepfake" technology that can be used to deceive or manipulate.
Although the app was shut down, critics expressed concern that some versions of the software remained available and would be abused.
"The #Deepnude app is out there now and will be used, despite the creator taking it off the market. If only there were a way to disable all the versions out there," CCRI tweeted.