cross-posted to: technology@lemmit.online
We Asked A.I. to Create the Joker. It Generated a Copyrighted Image.

Artists and researchers are exposing copyrighted material hidden within A.I. tools, raising fresh legal questions.
The problem here is that while the Joker is a pretty recognizable cultural icon, somebody using an AI may have a genuinely original idea for an image that just happens to have been independently developed by someone before. The AI can then produce a copy or close reproduction of that original artwork without disclosing its similarity to the source material, and the new "author" unknowingly rips off the original.
The prompts to reproduce the Joker and other superhero movie scenes were quite specific, but asking for an "animated sponge" is pretty innocent. It's not unthinkable that someone unfamiliar with Mr. SquarePants might believe they'd developed an original character using AI.
That’s a good point. Musicians have been known to accidentally reproduce another musician’s beat (was it done subconsciously, or was it just coincidence?). Some books are so strikingly similar to others that it makes you wonder whether they were rip-offs or coincidences. So it’s nothing new, but it may become more prevalent with AI. This could spawn a new industry of investigators ensuring your AI-generated art isn’t infringing on any copyrights 🤔
It’s on the person using any AI tools to verify that they aren’t infringing on anything if they try to market/sell something generated by these tools.
That goes for using ChatGPT just as much as it goes for Midjourney/Dall-E 3, tools that create music, etc.
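For what it’s worth, part of that verification could probably be automated. Here’s a rough sketch of flagging generated images that look suspiciously close to known artwork using perceptual hashes; it assumes you’ve collected reference images in a local folder and installed the third-party `imagehash` and Pillow libraries, and the file names and threshold are made up for illustration. This only catches near-copies, not stylistic or compositional similarity, so it’s no substitute for actually reviewing the output.

```python
# Sketch: compare an AI-generated image against a folder of reference artwork
# using perceptual hashing. "known_art/" and the threshold are hypothetical.
from pathlib import Path

from PIL import Image
import imagehash

THRESHOLD = 8  # Hamming distance at or below which two images look suspiciously alike


def similar_references(generated_path: str, reference_dir: str = "known_art"):
    """Return (filename, distance) pairs for reference images close to the generated one."""
    generated_hash = imagehash.phash(Image.open(generated_path))
    hits = []
    for ref in Path(reference_dir).glob("*.png"):
        distance = generated_hash - imagehash.phash(Image.open(ref))  # Hamming distance
        if distance <= THRESHOLD:
            hits.append((ref.name, distance))
    return sorted(hits, key=lambda pair: pair[1])


if __name__ == "__main__":
    for name, distance in similar_references("output_from_midjourney.png"):
        print(f"Possible match: {name} (distance {distance})")
```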
And you’re absolutely right, this is going to be more and more of a problem for anyone using AI tools, and I’m curious to see how it will factor into future lawsuits.
I could see some new fair-use factor being raised in court, or courts taking this into account under one of the pre-existing factors.
This might be the best point I’ve seen around this topic; I haven’t seen it addressed before.