New AI deepfake app creates nude images of women in seconds











The fake nudes aren’t perfect but could easily be mistaken for the real thing

The resulting fakes could be used to shame, harass, and intimidate their targets
A new AI-powered software tool makes it easy for anyone to generate realistic nude images of women simply by feeding the program a picture of the intended target wearing clothes.
The app is called DeepNude, and it’s the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard’s Samantha Cole, and is available to download free for Windows, with a premium version offering higher-resolution output images for $99.
Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as “fake.” But in the images created by Motherboard, this watermark is easy to remove. (We were unable to test the app ourselves, as the servers have apparently been overloaded.)
As we’ve seen with previous examples of deepfake pornography, the quality of the output varies. It’s certainly not photorealistic, and when examined closely the images are easy to spot as fake. The AI-generated flesh is blurry and pixelated, and the process works best on high-resolution images in which the target is already wearing revealing clothes like a swimsuit.
But at lower resolutions — or when seen only briefly — the fake images are easy to mistake for the real thing, and could cause untold damage to individuals’ lives.
Although much of the discussion around the potential harms of deepfakes has centered on political misinformation and propaganda, the use of this technology to target women has been a constant since its creation. Indeed, that was how the tech first spread, with users on Reddit adapting AI research published by academics to create fake celebrity pornography.
A recent report from HuffPost highlighted how being targeted by deepfake pornography and nudes can upend someone’s life. As with revenge porn, these images can be used to shame, harass, intimidate, and silence women. There are forums where men can pay experts to create deepfakes of co-workers, friends, or family members, but tools like DeepNude make it easy to create such images in private, and at the touch of a button.
Notably, the app is not capable of producing nude images of men. As reported by Motherboard, if you feed it a picture of a man it simply adds a vulva.
The creator of the DeepNude app, who identified himself as “Alberto,” told Motherboard that he was inspired by memories of old comic book adverts for “X-ray specs,” which promised they could be used to see through people’s clothes. “Like everyone, I was fascinated by the idea that they could really exist and this memory remained,” said Alberto.
He says that he is a “technology enthusiast” rather than a voyeur, and is motivated by curiosity and enthusiasm for AI, as well as a desire to see if he could make an “economic return” from his experiments.
Alberto says he considered the potential for harm caused by this software, but ultimately decided it was not a barrier. “I also said to myself: the technology is ready (within everyone’s reach),” Alberto told Motherboard. “So if someone has bad intentions, having DeepNude doesn’t change much ... If I don’t do it, someone else will do it in a year.”
We contacted Alberto to ask further questions, and he replied briefly, saying the app was created for fun and that he hadn’t expected it to become so popular. He again compared the software to Photoshop, saying that it can be used to achieve the same results as DeepNude “after half hours of youtube tutorial.” He also added that if people started using the software for malicious purposes, “we will quit it for sure.” (We followed up to ask what counted as a bad use case, and how he would know, but have yet to receive an answer.)
One negative aspect that Alberto does seem to be worried about is the potential legal fallout, with the license agreement for DeepNude claiming that “every picture edited through this software is considered a fake parody,” and that the app is an “entertainment service” that does “not promote …”