- cross-posted to:
- technology@lemmit.online
A.I. Is Making the Sexual Exploitation of Girls Even Worse: Parents, schools and our laws need to catch up to technology, fast.
Photoshop has existed for quite some time. Take a photo, google a naked body, paste the face onto the body. The AI-powered bit just makes it slightly easier. I don’t want a future where my device is locked down and surveilled to the point that I can’t install what I want on it. Nor should the common man be excluded from taking advantage of these tools. This is a people problem. Maybe culture needs to change. Limit phone use in schools. Technical solutions will likely only bring worse problems. There are probably no lazy solutions here. This is not one of those problems you can just hand over to some company and tell them to figure it out.
That said, I could get behind making it illegal to upload and store someone’s likeness unless explicit consent was given. That is long overdue. But some big companies would not get behind it, so it would be a hard sell. In fact, I would like all personal data to be illegal to store, trade and sell.
Slightly easier? That’s one hell of an understatement. Have you ever used Stable Diffusion?
But many big companies would love it. It would basically turn a likeness into intellectual property. Someone who pirates a movie would also be pirating likenesses. The copyright industry would love it; the internet industry, not so much.
Licensing their likeness - giving consent after receiving money, if you prefer - would also be a new income stream for celebrities. They could license their likeness for any movie or show and earn a pretty penny without even having to show up. They would just be deep-faked onto some skilled, low-paid double.