Deleted member 484047
Well, this certainly isn't creepy as all hell.
It was one thing for people to toss a person's face onto another person's body. But this is taking it to a whole new level, and quite honestly, I think the government REALLY needs to step up to the plate and deal with this before it gets out of hand.
A new AI-powered software tool makes it easy for anyone to generate realistic nude images of women simply by feeding the program a picture of the intended target wearing clothes.
The app is called DeepNude and it’s the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard’s Samantha Cole, and is available to download free for Windows, with a premium version that offers better resolution output images available for $99.
Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as “fake.” But in the images created by Motherboard, this watermark is easy to remove. (We were unable to test the app ourselves as the servers have apparently been overloaded.)
I get that this version produces low-resolution images that will be easy enough to debunk, but that's not really the point. We are now entering territory where anyone can simply grab a picture online and turn it into a realistic nude. Sure, people have been doing this for quite some time when it comes to celebrities, but imagine your average person having a picture saved from Facebook, Instagram, or Twitter and having this done to them. Give it a few more months or years, and it's going to be capable of producing images that aren't obviously fake.
What really bothers me is that people aren't up in arms about this kind of technology. Every time someone creates a new or better version of deepfaking, it barely even registers on the news. There needs to be pressure on government officials to put the kibosh on things like this, or at least impose severe limitations, even if it may be too late now. I'm normally not one to be all for the government squashing innovation in technology, but something like this is starting to cross lines that aren't okay with me.
Deepfakes do exist in a legal gray area though, with lawyers saying that AI-generated nudes could constitute defamation, but that removing them from the internet would be a possible violation of the First Amendment.
And that's the rub: deepfaking seems to be legal in some respects and illegal in others, so it needs to be straightened out in the legal system before anyone can even begin fighting it.
So what say you?