Intellectual property is fake lmao. Train your AI on whatever you want
I agree, but only if it goes both ways. We should be allowed to use big corpo's IPs however we want.
You are allowed to use copyrighted content for training. I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF if you haven't already. The EFF is a digital rights group who most recently won a historic case: border guards now need a warrant to search your phone.
I know. AI is capable of recreating many ideas it sees in the training data even if it doesn't recreate the exact images. For example, if you ask for Mario, you get Mario. Even if you can't use these images of Mario without committing copyright infringement, AI companies are allowed to sell you access to the AI and those images, thereby monetizing them. What I am saying is that if AI companies can do that, we should be allowed to use our own depictions of Mario that aren't AI generated however we want.
AI companies can sell you Mario pics, but you can't make a Mario fan game without hearing from Nintendo's lawyers. I think you should be allowed to.
The comparison doesn't work, because the AI is replacing the pencil or other drawing tool. We aren't saying pencil companies are selling you Mario pics just because you can draw a Mario picture with a pencil, either. Just because the process of how the drawing is made differs doesn't change the concept behind it.
An AI tool that advertises Mario pictures would break copyright/trademark laws and hear from Nintendo quickly.
Except that you interact with the "tool" in pretty much the same way you'd interact with a human that you're commissioning for art, minus a few pleasantries. A pencil doesn't know how to draw Mario.
AI tools implicitly advertise Mario pictures because you know that:
The AI was trained on lots of images, including Mario.
The AI can give you pictures of stuff it was trained on.
An animation studio commissioned to make a cartoon about Mario would still get in trouble, even if they had never explicitly advertised the ability to draw Mario.
I don't think how you interact with a tool matters. Typing what you want, drawing it yourself, or clicking through options is all the same. There are even other programs that allow you to draw by typing. They are way more difficult but again, I don't think the difficulty matters.
There are other tools that allow you to recreate copyrighted material fairly easily, character creators being at the top of the list. Games like The Sims are well known for having tons of Sims that are characters from copyrighted IP. Everyone can recreate Barbie or any Disney Princess in The Sims. Heck, you can even download pre-made characters on the official mod site. Yet we aren't calling out The Sims for selling these characters. Because it doesn't make sense.
Just so we're clear, my position is that it should all be okay. Copyright infringement by copying ideas is a bullshit capitalist social construct.
I don't buy the pencil comparison. If I have a painting in my basement that has a distinctive style, but has never been digitized and trained upon, I'd wager you wouldn't be able to recreate either that image or its style. What gives? Because AI is not a pencil but more like a data mixer you throw complete works into, and it spews out collages. Maybe collages of very finely shredded pieces, to the point you couldn't even tell, but pieces of original works nonetheless. If you put any non-free works in it, they definitely contaminate the output, and so the act of putting them in in the first place should be a copyright violation in itself. The same as if I were to show you the image in question and you decided to recreate it: I could sue you, and I would win.
That is a fundamental misunderstanding of how AI works. It does not shred the art and recreate things with the pieces. It doesn't even store the art in the algorithm. One of the biggest methods right now is basically taking an image of purely random pixels. You show it a piece of art with a whole lot of tags attached. It then semi-randomly changes pixel colors until it matches the training image. That set of instructions is associated with the tags, and the two are combined into a series of tiny weights that the randomizer uses. Then the next image modifies the weights. Then the next, then the next. It's all just teeny tiny modifications to random number generation. Even if you trained an AI on only a single image, it would be almost impossible for it to produce it again perfectly because each generation starts with a truly (as truly as a computer can get: unweighted) random image of pixels. Even if you force-fed it the same starting image of noise that it trained on, it is still only weighting random numbers and still probably won't create the original art, though it may be more or less indistinguishable at a glance.
AI is just another tool. Like many digital art tools before it, it has been maligned from the start. But the truth is what it produces is the issue, not how. Stealing others' art by manually reproducing it or using AI is just as bad. Using art you're familiar with to inspire your own creation, or using an AI trained on known art to make your own creation, should be fine.
As a side note, because it wasn't too clear from your writing: the weights are only tweaked a tiny, tiny bit by each training image. Unless the trainer sees the same image a shitload of times (the Mona Lisa, that one stock photo used to show off phone cases, etc.) the image can't be recreated by the AI at all. Elements of the image that are shared with lots of other images (shading style, poses, Mario's general character design, etc.) could, but you're never getting that one original image, or even any particular identifiable element from it, out of the AI. The AI learns concepts and how they interact because the amount of influence it takes from each individual image and its caption is so incredibly tiny, but it trains on hundreds of millions of images and captions. The goal of AI image generation is to be able to create a vast variety of images directed by prompts; generating lots of images which directly resemble anything in the training set is undesirable, and in the field it's called over-fitting.
Anyways, the end result is that AI isn't photo-bashing, it's more like concept-bashing. And lots of methods exist now to better control the outputs, from ControlNet, to fine-tuning on a smaller set of images, to Dalle-3 which can follow complex natural language prompts better than older methods.
Regardless, lots of people find that training generative AI using a mass of otherwise copyrighted data (images, fan fiction, news articles, ebooks, what have you) without prior consent just really icky.
You show it a piece of art with a whole lot of tags attached. It then semi-randomly changes pixel colors until it matches the training image. That set of instructions is associated with the tags, and the two are combined into a series of tiny weights that the randomizer uses.
Anyways, the end result is that AI isn’t photo-bashing, it’s more like concept-bashing
That's what I meant by "very finely shredded pieces". I oversimplified it, yes. But what I mean is that it's not literally taking a pixel off an image and putting it into the output. It's that using the original image in any way is just copying with extra steps.
Say we forego AI entirely and talk real-world copyright. If I were to record a movie theater screen with a camcorder, I would commit copyright infringement, even though it's transformed by my camera lens. Same as if I were to distribute the copyrighted work in a ZIP file, invert its colors, or trace every frame and paint it with watercolors.
What if I were to distribute the work's name alongside its SHA-1 hash? You might argue that such a transformation destroys the original work, can no longer be used to retrieve the original, and therefore should be legal. But if that were the case, torrent site owners could sleep peacefully knowing that they are safe from prosecution. The real world has shown that's not the case.
Now, what if we take some hashing function and brute force the seed until we get one which outputs the SHA-1s of certain works given their names? That'd be a terrible version of AI, acting exactly like an over-trained model would: spouting random numbers except for works it was "trained" upon. Is distributing such a seed/weight a copyright violation? I'd argue that'd be an overly complicated way to conceal piracy, but yes, it would be. Because those seeds/weights are still based on the original works, even if not strictly a direct result of their transformation.
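To make that thought experiment concrete, here's a toy sketch of it in Python (purely my own illustration: made-up file names, and digests truncated to 8 bits so the brute-force search actually finishes):

```python
import hashlib

# Toy version of the thought experiment above (illustrative only, not a real attack):
# brute-force a single integer "seed" so that a keyed hash of each work's *name*
# reproduces a short prefix of that work's actual SHA-1 digest.
works = {
    "work_a.mkv": b"contents of work a",   # stand-ins for real copyrighted files
    "work_b.mkv": b"contents of work b",
}

# Target: the first 2 hex chars (8 bits) of each work's real SHA-1, so the toy search stays fast.
targets = {name: hashlib.sha1(data).hexdigest()[:2] for name, data in works.items()}

def keyed_digest(seed: int, name: str) -> str:
    """A made-up 'hash function with a seed' for the sake of the thought experiment."""
    return hashlib.sha1(f"{seed}:{name}".encode()).hexdigest()[:2]

seed = 0
while not all(keyed_digest(seed, name) == target for name, target in targets.items()):
    seed += 1

print("found seed:", seed)  # this one number now only "works" because it was fitted to those works
```

With full-length digests this search is computationally infeasible, which is the whole point: the only reason the found seed reproduces anything is that it was fitted to those specific works.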
Anyways, the end result is that AI isn’t photo-bashing, it’s more like concept-bashing
Copying concepts is also a copyright infringement, though
Regardless, lots of people find that training generative AI using a mass of otherwise copyrighted data (images, fan fiction, news articles, ebooks, what have you) without prior consent just really icky.
It shouldn't be just "icky", it should be illegal and be prosecuted ASAP. The longer it goes on like this, the more the entire internet is going to be filled with those kind-of-copyrighted things, and eventually turn into a lawsuit shitstorm.
Heads up, this is a long fucking comment. I don't care if you love or hate AI art, what it represents, or how it's trained. I'm here to inform, refine your understanding of the tools (and how exactly that might fit in the current legal landscape), and nothing more. I make no judgements about whether you should or shouldn't like AI art or generative AI in general. You may disagree about some of the legal standpoints too, but please be aware of how the tools actually work because grossly oversimplifying them creates serious confusion and frustration when discussing it.
Just know that, because these tools are open source and publicly available to use offline, Pandora's box has been opened.
copying concepts is also copyright infringement
Except it really isn't in many cases, and even in the cases where it could be, there can be rather important exceptions. How this all applies to AI tools/companies themselves is honestly still up for debate.
Copyright protects actual works (aka "specific expression"), not mere ideas.
The concept of a descending-blocks puzzle game isn't copyrighted, but the very specific mechanics of Tetris are copyrighted. The concept of a cartoon mouse isn't copyrighted, but Mickey Mouse's visual design is. The concept of a brown-haired girl with wolf ears/tail and red eyes is not copyrighted, but the exact depiction of Holo from Spice and Wolf is (though that's more complicated due to weaker trademark and stronger copyright laws in Japan). A particular chord progression is not copyrightable (or at least it shouldn't be), but a song or performance created with it is.
A mere concept is not copyrightable. Once the concept is specific enough and you have copyrighted visual depictions of it, then you start to run more into trademark law territory and start to gain a copyright case. I really feel like these cases are kinda exceptions though, at least for the core models like Stable Diffusion itself, because there's just so much existing art (both official and even more so copyright/trademark-infringing fan art) of characters like Mickey Mouse anyways.
The thing the AI does is distill concepts and interactions between concepts shared between many input images, and it can do so in a generalized way that allows concepts never before seen together to be mixed together easily. You aren't getting transformations of specific images out of the AI, or even small pieces of each trained image; you're instead getting transformations of learned concepts shared across many, many, many works. This is why the shredding analogy just doesn't work. The AI generally doesn't, and is not designed to, mimic individual training images. A single image changes the weights of the AI by such a minuscule amount, and those exact same weights are also changed by many other images the AI trains on. Generative AI is very distinctly different from tracing, or from distributing mass information that's precisely specific enough to pirate content, or from transforming copyrighted works to make them less detectable.
To drive the point home, I'd like to expand on how the AI and its training is actually implemented, because I think that might clear some things up for anyone reading. I feel like the actual way in which the AI training uses images matters.
A diffusion model, which is what current AI art uses, is a giant neural network that we want to guess the noise pattern in an image. To train it on an image, we add some random amount of noise to the whole image (could be a small amount like film grain, or enough to turn the image into pure noise, but it's random each time), then pass that image and its caption through the AI to get the noise pattern the AI guesses is in the image. Now we take the difference between the noise pattern it guessed and the noise pattern we actually added to the training image to calculate the error. Finally, we tweak the AI weights based on that error. Of note, we don't tweak the AI to perfectly guess the noise pattern or reduce the error to zero; we barely tweak the AI to guess ever so slightly better (like, 0.001% better). Because the AI is never supposed to see the same image many times, it has to learn to interpret the captions (and thus concepts) provided alongside each image to direct its noise guesses. The AI still ends up being really bad at guessing high noise or completely random noise anyways, which is yet another reason why it can't generally reproduce existing trained images from nothing.
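If it helps to see the shape of it, here's a stripped-down sketch of one training step in PyTorch (a toy stand-in network and a simplified noising schedule of my own, not code from any real model):

```python
import torch
import torch.nn as nn

# Toy stand-in for the noise-guessing network; real models use large U-Nets/transformers
# plus a text encoder for the captions.
denoiser = nn.Sequential(nn.Linear(64 + 16, 128), nn.ReLU(), nn.Linear(128, 64))
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-4)

image = torch.randn(1, 64)          # stand-in for one (flattened) training image
caption_embed = torch.randn(1, 16)  # stand-in for the caption's embedding

# 1. Add a random amount of noise to the whole image.
noise = torch.randn_like(image)
t = torch.rand(1, 1)                        # random noise level in [0, 1]
noisy_image = (1 - t) * image + t * noise   # simplified noising schedule

# 2. Ask the network to guess the noise, conditioned on the caption.
guess = denoiser(torch.cat([noisy_image, caption_embed], dim=1))

# 3. The error is the difference between the guessed noise and the noise we actually added.
loss = nn.functional.mse_loss(guess, noise)

# 4. Nudge the weights a tiny amount toward a slightly better guess (not to zero error).
optimizer.zero_grad()
loss.backward()
optimizer.step()
```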
Now let's talk about generation (aka "inference"). So we have an AI that's decent at guessing noise patterns in existing images as long as we provide captions. This works even for images that it didn't train on. That's great for denoising and upscaling existing images, but how do we get it to generate new unique images? By asking it to denoise random noise and giving it a caption! It's still really shitty at this though; the image just looks like some blobby splotches of color with no form (else it probably wouldn't work at denoising existing images anyways). We have a hack though: add some random noise back into the generated image and send it through the AI again. Every time we do this, the image gets sharper and more refined, and looks more and more like the caption we provided. After doing this 10-20 times we end up with a completely original image that isn't identifiable in the training set but looks conceptually similar to existing images that share similar concepts.

The AI has learned not to copy images while training, but has actually learned visual concepts. Concepts which are generally not copyrighted. Some very specific depictions which it learns are technically copyrighted, i.e. Mickey Mouse's character design, but the problem with that claim too is that there are fair use exceptions, legitimate use cases, which can often cover someone who uses the AI in this capacity (parody, educational, not for profit, etc.). Whether providing a tool that can just straight up allow anyone to create infringing depictions of common characters or designs is legal is up for debate, but when you use generative AI it's up to you to know the legality of publishing the content you create with it, just like with hand-made art. And besides, if you ask an AI model or another artist to draw Mickey Mouse for you, you know what you're asking for, it's not a surprise, and many artists would be happy to oblige so long as their work doesn't get construed as official Disney company art. (I guess that's sorta a point of contention about this whole topic though, isn't it? If artists could get takedowns on their Mickey Mouse art, why wouldn't an AI model get takedowns too for trivially being able to create it?)
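And a matching sketch of that generation loop (again a toy setup of my own with a made-up stand-in network; real samplers like DDPM or DDIM use carefully derived update rules rather than this simplified one):

```python
import torch
import torch.nn as nn

# Same kind of toy noise-guessing network as in the training sketch (untrained here).
denoiser = nn.Sequential(nn.Linear(64 + 16, 128), nn.ReLU(), nn.Linear(128, 64))
caption_embed = torch.randn(1, 16)  # stand-in for the prompt's embedding

steps = 20
x = torch.randn(1, 64)  # start from pure random noise, not from any stored image
with torch.no_grad():
    for i in range(steps):
        guess = denoiser(torch.cat([x, caption_embed], dim=1))  # guess the noise, given the prompt
        x = x - guess / steps                                   # remove a bit of the guessed noise
        if i < steps - 1:
            x = x + 0.1 * torch.randn_like(x)                   # add some fresh noise back in and repeat

# x is now a new sample shaped by learned concepts, not a lookup of any training image.
```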
Anyways, if you want this sort of training or model release to be a copyright violation, as many do, I'm unconvinced current copyright/IP laws could handle it gracefully, because even if the precise method by which AIs and humans learn and execute is different, the end result is basically the same. We have to draw new, more specific lines on what is and isn't allowed, and decide how AI tools should be regulated while taking care not to harm real artists, and few will agree on where the lines should be drawn.
Also though, Stable Diffusion and its many, many descendants are already released publicly and open source (same with Llama for text generation), and it's been disseminated to so many people that you can no longer stop it from existing. That fact doesn't give StabilityAI a pass, nor do other AI companies who keep their models private get a pass, but it's still worth remembering that Pandora's box has already been opened.
I agree.
The problem is that you might technically be allowed to, but that doesn't mean you have the funds to fight every court case from someone insisting that you can't or shouldn't be allowed to. There are some very deep pockets on both sides of this.
Let them come.
I like your moxie, soldier
What a tough guy you are, based AI artist fighting off all the artcucks like a boss 😎
That's not it going both ways. You shouldn't be allowed to use anyone's IP against the copyright holder's wishes. Regardless of size.
Nah all information should be freely available to as many people as practically possible. Information is the most important part of being human. All copyright is inherently immoral.
I'd agree with you if people didn't need to earn money to live. You can't enact communism by destroying the supporters of it. I fully support communism in theory, we should strive for a community based government. I'm also a game developer and when I make things I need to be able to pay my bills because we still live in capitalism.
If copyright was vigorously enforced a lot more people would starve than would be fed
It's already vigorously enforced. Maybe you can expand on what you mean.
Go on Etsy, search "mickey mouse", and go report all the hundreds of artists whose whole career is violating the copyright of the most litigious company on the planet.
As long as I'm not pretending to be Nintendo, can you quantify how exactly releasing, for example, a fan Mario game is unethical? It having the potential to hurt their sales if you make a better game than them doesn't count because otherwise that would imply that out-competing anyone in any market must be unethical, which is absurd.
No, it's about if you make a game that's worse or off-brand. If you make a bunch of Mario games into horror games and then everyone thinks of horror when they think of Mario then good or bad, you've ruined their branding and image. Equally, if you make a bunch of trash and people see Mario as just a trash franchise (like how most people see Sonic games) then it ruins Nintendo's ability to capitalize on their own work.
No one is worried about a fan-made Mario game being better.
Wouldn't that only be a problem if you pretended to be Nintendo? There are a lot of fan Sonic games and I don't think it's affected Sonic's image very much. There's shit like Sonic.EXE, which is absolutely a horror game, but people still don't think of horror when they think of Sonic. The reason Sonic is a trash franchise is because of SEGA consistently releasing shitty Sonic games. Hell, some of the fan games actually do beat out the official games in quality and polish.
Hypothetically, assuming you're right and fan games actually do hurt their branding, there's still no reason for it to be illegal. We don't ban shitty mobile games for giving good mobile games a bad rep, and neither Nintendo nor mobile devs can claim libel or defamation.
Actually we do. If your game is so terrible, it can get removed from the Google Play store. It has to be really bad and they rarely do it. Steam does this as well. Epic avoids this by vetting the games they put on their platform site first. It's why the term asset flip exists.
Ban in a legal sense, not ban from a proprietary store.
Sure, that's fair. Either way, good or bad, it's illegal to take someone's existing IP and add on to it. No one is legally banning games based on quality, and whether they're good or bad doesn't matter. It matters that the original story owner has a vision, and they have the right to make money off of their work without other people trying to take or add on to the story themselves. Brand fatigue is a real thing, and while all of the CoD games are great and fairly high quality, the reason they get a bad rap is exactly brand fatigue. The Assassin's Creed games were the same for a bit, but they resolved that by spreading releases out.
Either way, the arguments to let people just add on to whatever they want seem to fall flat. Why not just build a universe like Mario and call it Maryo? Or The Great Giana Sisters? Why involve someone else's IP at all? Because you are profiting off of the popularity that someone else built with quality products. Even in a communist society I'd want that banned, because it's lazy, misrepresents the original vision, and overall it's completely avoidable without a problem. In fact, why not force people to create new things instead of letting them be lazy and steal the popularity of a well-made IP?
Brand fatigue is definitely a real thing, I will agree with you there. In the same boat, so is genre fatigue. If you play a lot of platformers, you'll eventually get bored and move on to something else. Nintendo clearly has no right to go after other platforming games that have copied from Mario's mechanics, so why should they have the right to go after games that use Mario's aesthetic or name?
Fan games are not normally morally wrong, but I do think it's kind of trashy to try and make money off of someone else's brand if you're not doing it out of passion. I just don't see it as a legal problem, much like how crappy off-brand ripoffs and lazily made games aren't a legal problem. It's also worth noting that the people making the most money off of the IP are just executives that likely had little to no part in creating the characters and care less about appreciating the artistic work than most fan game makers.
Look at Sonic P-06, a fan-made remake of Sonic '06. It's highly polished, has fleshed out features that never made it into the original game, and is absolutely a labor of love. The original came out a buggy pile of garbage because it was forced out by those aforementioned business-people before it was ready. Most companies' strict protection of their IP just prevents works of art like P-06 from seeing the light of day. I think SEGA's policy of allowing non-monetized Sonic fan games, enshrined into law, would be a reasonable compromise.
Nintendo clearly has no right to go after other platforming games that have copied from Mario’s mechanics, so why should they have the right to go after games that use Mario’s aesthetic or name?
This argument doesn't hold water to me. How are you conflating a brand with all games in a genre? IP and brands are a signifier of what to expect. Sherlock Holmes is a great example of what used to be an IP known for a very specific style of mystery story but now just means "genius problem solver maybe with a drug habit."
Fan games are not normally morally wrong, but I do think it’s kind of trashy to try and make money off of someone else’s brand if you’re not doing it out of passion. I just don’t see it as a legal problem
We are going to just have to agree to disagree. Spiritual successors happen all the time. The reason people make fan games is that there is a lexicon built into their project already. It's a shortcut instead of building (and considering) what is meaningful to your game in a lexicon. Additionally, a lot of people do not consider how their changes affect the lexicon throughout all games. So what you are left with is mostly people who don't truly understand, never talked to the creators, never worked with them, assume everything from a product perspective, pushing out something that adds to the brand without any true coherency or consideration of future titles.
I for one see it as wrong to attempt to take someone's work and not only pass it off as your own but also potentially break their ability to make future iterations of their work.
Look at Sonic P-06, a fan-made remake of Sonic ‘06. It’s highly polished, has fleshed out features that never made it into the original game, and is absolutely a labor of love.
Perhaps a labor of love. From my initial parsing of reviews, looks like they didn't attempt to change anything but kept close to the source material.
Most companies’ strict protection of their IP just prevents works of art like P-06 from seeing the light of day
I feel like that's okay. There are plenty of original and better ideas out there. It doesn't prevent things from being made. P-06 being "still terrible but better" doesn't really say anything. Games like Black Mesa, Sonic Media, Skywind, and very specifically Zera: Myths Awaken also exist; that last one started as Spyro 4 until Activision sent a C&D, so they rebranded. These things are easily fixed, and honestly, it's great if fans want to attempt to build a game off of someone's IP and then ask the IP holder if they can continue with it. If not, they can just rebrand and create their own universe. Temtem is a great example of what happens when people are forced to create their own thing. It becomes more impactful and allows for a more interesting product. Not just a cookie-cutter game.
I get what you're saying. The problem is that you can't argue your case just by giving examples of how IP enforcement can lead to people innovating more in the way that I can. The debate is asymmetrical because all I have to do is show that IP violation doesn't uniquely negatively affect IP holders in ways that legal activities can't, demonstrating precedent for that kind of competition/harm being legal. You need to justify forcibly imposing limitations on what people are allowed to do, which has a higher "burden of proof" if you get what I'm saying.
I don't think that's all you have to show. In fact, I feel like a lot of your examples have missed the point entirely. The point isn't that there are other ways that could maybe impact the sales or recognition a game gets. This conversation also jumps between copyright and trademark protections. Copyright is about protecting actual assets. You take my assets and make something else with them without my consent, you've stolen my work. That's a bad, immoral thing. The same applies to the copyright of characters and story: you take the building blocks I've made, you've stolen my work. Trademark is about protecting brand recognition and deals with IP violations.
With that covered, the point of copyright is to ensure the people who did the work get paid for their work. That large or small companies don't have to worry about someone stealing their works and allowing them to innovate.
So when you say "whatever, people will just make another game in the genre," that's innovation. What if it's bad? Not everything is good; that's still positive innovation. You keep mentioning large IPs, but the truth is that the law applies equally, and large corporations are already taking people's work without asking. We do not need more of that.
For fan-made games, there isn't a huge point to making them without the blessing of the IP holder. You might point to large studios vs. small fans, but think of it at every scale, especially middle to small studios being stolen from. "It's just a fan game" doesn't really hold water when it's potentially putting people out of business because of the issues I've already shown. Imagine if copyright wasn't enforced and someone just re-uploaded existing games. Where do you draw the line on what a fan game needs to charge in order to qualify? I can tell you right now 99% of players wouldn't buy a game on Steam if there was an exact fan copy of it available for free. So then you have to draw up all these lines that are frankly unfair to the creators. So just let them choose. Their works belong to them, not to people who merely like the game.
So I don't see any point for which a looser copyright law would be overall more helpful to society. We need courts to allow for smaller creators to justify fair use but that doesn't cover anything we talked about.
"Artists don't deserve to profit off their own work" is stupid as shit. Complain about copyright abuse and lobbying a la Disney and I'll be right there with you, but people shouldn't have the right to take your work and profit off it without either your consent or paying you for it.
Artists and other creatives who actually do work to create art (not shitting out text into an image generator) should take every priority over AI "creators."
No you don't understand, the machine works exactly like a human brain! That makes stealing the work of others completely justifiable and not even really theft!
/s, bc apparently this community has a bunch of dumbass tech bros that genuinely think this
This but mostly unironically. And before you go insulting me, I'm an artist myself and wouldn't be where I am if I wasn't allowed to learn from other people's art to teach myself.
Having a masters degree in machine learning doesn't make you an artist grow tf up xD
No, but drawing for the last 13 years of my life does.
don't dismiss me, I like being spit on
And this is a strawman. If this argument is being made, it's most likely because of their own misunderstanding of the subject. They are most likely trying to make the argument that the way biological neural networks and artificial neural networks 'learn' is similar. Which is true to a certain extent, since one is derived from the other. There's a legitimate argument to be made that this inherently provides transformation, and it's exceptionally easy to see that in most unguided prompts.
I haven't seen your version of this argument being spoken around here at all. In fact it feels like a personal interpretation of someone who did not understand what someone else was trying to communicate to them. A shame to imply that's an argument people are regularly making.
Equating training AI to not being able to profit is stupid as shit, and it's the same bullshit argument big companies use to say "we lost a bazillion dollars to people pirating our software." Someone training their AI on an artwork (that is probably under a Creative Commons licence anyway) doesn't suck money out of an artist's pocket that they would have otherwise made.
Artists and other creatives who actually do work to create art (not shitting out text into an image generator) should take every priority over AI "creators."
Why are you the one that gets to decide what is "work" to create art? Should digital artists not count because they are computer assisted, don't require as much skill and technique as "traditional" artists and use tools that are based on the work of others like, say, brush makers?
And the language you use shows that you're vindictive and angry.
Should digital artists not count because they are computer assisted, don't require as much skill and technique as "traditional" artists and use tools that are based on the work of others like, say, brush makers?
My brother in Christ, they didn't even allude to this, this is an entirely new thought.
Yeah, no shit, Sherlock. I'm applying their flawed logic to other situations, where the conclusion is even more dumb, so he can see that the logic doesn't work.
But leaping from 'copyright is made-up' to 'so artists should starve?' is totally reasonable discourse.
Explain what you think will happen if artists who depend on their art to make a living lose copyright protection.
Commissions, patronage, subscriptions, everything else rando digital artists do when any idiot can post a JPG everywhere and DMCA takedowns accomplish roughly dick.
Meanwhile - you wanna talk about people who've been fucked over by corporations that decide their original artwork is too close to something a dead guy made?
Or look into whether professional artists were having a good time, before all this? Intellectual property laws have funded and then effectively destroyed countless years of effort by artists who aren't even allowed to talk about it due to NDAs. CGI firms keep losing everything and going under while the movies they worked on make billions. The status quo is not all sunshine and rainbows. Pretending the choice is money versus nothing is deeply dishonest.
“Just internet beg” oh okay. Shows exactly how much you value the people making the art you want to feed into the instant gratification machine.
'How money?' Existing lucrative business ventures by quite a lot of artists. 'So beg!' Yeah you got me, how intellectually sincere, gold sticker, you can leave.
They said IP, IP protects artists from having their work stolen. The fact AI guzzlers are big mad that IP might apply to them too is irrelevant.
Digital artists do exactly as much work as traditional artists, comparing it to AI “art” from an AI “artist” is asinine. Do you actually think digital artists just type shit in and a 3D model appears or something?
And yeah I’m angry when my friends and family who make their living as actual artists, digital and traditional, have their work stolen or used without their permission. They aren’t fucking corporations making up numbers about lost sales, they’re spending weeks trying to get straight up stolen art mass printed on tshirts and mugs removed from online sale. They’re going outside and seeing their art on shit they’ve never sold. Almost none of them own a home or even make enough to not have a regular job, it’s literally taking money out of their pockets to steal their work. This is the shit you’re endorsing by shitting on the idea of IP.
Do you actually think artists using AI tools just type shit into the input and output decent art? It's still just a new, stronger digital tool. Many previous tools have been demonized, claiming they trivialize the work and people who used them were called hacks and lazy. Over time they get normalized.
And as far as training data being considered stealing IP, I don't buy it. I don't think anyone who's actually looked into what the training process is and understands it properly would either. For IP concerns, the output should be the only meaningful measure. It's just as shitty to copy art manually as it is to copy it with AI. Just because an AI used an art piece in training doesn't mean it infringed until someone tries to use it to copy it. Which, agreed, is a super shitty thing to do. But again, it's a tool, how it's used is more important than how it's made.
Lmao, I’ve used AI image generation, you’re not going to be able to convince me any skill was involved in what I made. The fact some people type a lot more and keep throwing shit at the wall to see what sticks doesn’t make it art or anything they’ve done with their own skill. The fact none of them can control what they’re making every time the sauce updates is proof of that.
If it’s so obviously not IP violating to train with it then I’m sure it’ll be totally fine if they train them without using artists’ work without permission, since it totally wasn’t relying on those IP violating images. Yet for some reason they fight this tooth and nail. 🤔
Except they totally could. But a data source of that size where everyone explicitly opted in to AI use does not exist.
The reason they fight it is in part also because training such models isn't exactly free. The hardware that it's done on costs hundreds of thousands of dollars and must run for long periods. You would not just do that for the funsies unless you have to. And considering the data by all means seems to be collected in legal ways, they have cause to fight such cases.
It's a bit weird to use that as an argument to begin with, since a party that knows they are at fault usually settles rather than fighting on and incurring more costs. It's almost as if they don't agree with your assertion that they needed permission, and that those images were IP violating 🤔
Oh no it costs money to use the art stealing machine to make uncopyrightable trash? 🥺
"But a data source of such size of material where everyone opted in to use for AI explicitly does not exist. "
Dang I wonder why 🤔🤔🤔🤔
Because AI wasn't a big thing before 2020, and no such permission has been legally necessary for obtained material so far (lawsuits are pending, of course). If something has no incentive to exist, it will not be created. There are plenty of ethical justifications for why no such permission is needed as well.
Oh cool, you think art theft is ethically justified as long as it's a robot doing it
Oh cool, you think misrepresenting and oversimplifying other people's points of view, and an accurate account of how certain copyright laws work (even when that's an inconvenient truth), is ethically justified as long as you can tell your anti-AI homies that you stood up for them by 'dunking' on a person who is arguing in good faith for them to fight the right battles, to not cling to false ideas which will lead them to suffer more in the long term, and to not turn people who would support them against them by spouting easily disputed lies.
But sure, go ahead! I'm sure you'll change so many minds by immediately disregarding everything they say and putting them in a box of "thieves" because they said something that didn't fit very specifically within your "Guidebook to hating anything related to AI".
Now, back to a serious discussion, if you're up for it. Creative freedom is built on the notion that ideas are the property of nobody; it is a requirement, since every artist in existence has derived their work from the work of others. It's not even controversial: using your definition of stealing means all artists 'steal' from each other all the time, and nobody cares. But because a robot does it (despite that robot being 100% under the control of the artist using it), it's suddenly the most outrageous thing.
I know for sure my ideas have been 'stolen' from my publicized works, but I understand I had no sole right to those ideas to begin with. I can't copyright them. And if a 'thief' used those ideas in a transformative manner rather than creating something that tries to recreate what I made (which would be actual infringement), they have every right to, since without that right literally nobody would be allowed to make anything, because everything we make is inspired by something we don't hold a copyright over. Most of the people actually producing stuff that gets displayed publicly, for other people to experience and pull apart and learn from, understand that we have no right to those ideas to begin with, except in how we applied them in a specific work.
Think of how much actual art you could've made in the time it took you to shill for the thing that's stealing art and fucking over creatives and using personal data to spy on people
Oh yeah, shame on me, spending a part of my day 'shilling' for myself and my friends and colleagues. And 'shilling' for a better future for us all by dissuading people from weaponizing bad arguments and misunderstandings to defend themselves, because that will not help them one bit. The latter part of your sentence is such utter nonsense that I don't even need to respond to it.
First of all, your second point is very sad to hear, but also a non-factor. You are aware people stole artwork before the advent of AI right? This has always been a problem with capitalism. It's very hard to get accountability unless you are some big shot with easy access to a lawyer at your disposal. It's always been shafting artists and those who do a lot of hard work.
I agree that artists deserve better and should gain more protections, but the unfortunate truth is that the wrong kind of response to AI could shaft them even more. Let's say inspiration could in some cases be ruled to be copyright infringement if the source of the inspiration could be reasonably traced back to another work. This could give big companies like Disney an easier pathway to sue people for copyright infringement; after all, your mind is forever tainted by their IP after you've watched a single Disney movie. Banning open source models from existing could also create a situation where those same big companies could build internal AI models from the art in their possession, while anyone without enough material could not. Which would mean that no one but the people already taking advantage of artists would benefit from the existence of the technology.
I get that you want to speak up for your friends and family, and perhaps they do different work than I imagine, but do you actually talk to them about what they do in their work?
Because digital artists also use non-AI algorithms to generate meshes and images. (And yes, that could be summed up as 'type shit in and a 3D model appears'.)
They also use building blocks, prefabs, and use reference assets to create new unique assets.
And like all artists they do take (sometimes direct) inspiration from the ideas of others, as does the rest of humanity.
Some of the digital artists I know have embraced the technology and combined it with the rest of their skills to create new works more efficiently and reduce their workload. Either by being able to produce more, or being able to spend more time refining works. It's just a tool that has made their life easier.
All of that is completely irrelevant to the fact that image generators ARE NOT PEOPLE and the way that people are inspired by other works has absolutely fuck all to do with how these algorithms generate images. Ideas aren’t copyrightable but these algorithms don’t use ideas because they don’t think, they use images that they very often do not have a legal right to use. The idea that they are equivalent is a self serving lie from the people who want to drive up hype about this and sell you a subscription.
I watch my husband work every day as a professional artist and I can tell you he doesn’t use AI, nor do any of the artists I know; they universally hate it because they can tell exactly how and why the shit it makes is hideous. They spot generated images I can’t because they’re used to seeing how this stuff is made. The only thing remotely close to an algorithm that they use are tools like stroke smoothing, which itself is so far from image generation it would be an outright lie to equate them.
Companies aren’t using this technology to ease artist workloads, they’re using it to replace them. There’s a reason Hollywood fought the strike as hard as they did.
The fact they are not people does not mean they can't use the same legal justifications that humans use. The law can't think ahead. The justification is rather simple: the output is transformative. Humans are allowed to be inspired by other works because the underlying ideas can't be copyrighted and can be applied transformatively. If the human uses that idea to produce something that's not transformative, it's also infringement. AI currently falls under that same reasoning.
You call it a self-serving lie, but I could easily say that about your arguments as well: that you only hold this opinion because you don't like AI (it seems). That's not constructive, and since I hope you care about artists as well, I implore you to actually engage in good-faith debate rather than just assuming the other person must be lying because they don't share your opinion. You are also forgetting that the people who benefit from image generators are people. They are artists too. Most from before AI was a thing, and some because it became a thing.
Again, sorry to hear your husband feels that way. I feel he is doing himself a disservice to dismiss a new technology, as history has not been kind to those who oppose inevitable change, especially when there are no good non-emotional reasons against this new technology. Most companies have never cared about artists; that fact was true without AI, and it will remain true whatever happens. But if they replace their artists with AI they are fools, because the technology isn't that great on its own (currently). You need human intervention to use AI to make high-quality output. It's a tool, not the entire process.
The Hollywood strikes are a good example of what artists should be doing rather than making certain false claims online. Join a union, get protection for your craft. Just because something is legal doesn't mean you can't fight for your right to demand specific protections from your employer. But strikes do not affect laws; they are organizational. They have no ramifications for people not part of the guilds involved. If a company that protects its artists while allowing them to use AI to accelerate their workflow comes out on top against a company that, despite its best intentions, has made its art department less profitable, the latter's artists will also lose their jobs. Since AI is very likely not to go away completely even in the most optimistic of scenarios, that's eventually a worse situation than before.
And lastly, I guess your husband does different work than the digital artists I work with then. You have a ton of generation tools for meshes and textures. I also never equated it directly to AI, but you stated that they use no tools which do all the work for them (such as building a mesh for them), which is false. You wouldn't use the direct output of AI as well.
I implore you to look at "algorithmic art": https://en.wikipedia.org/wiki/Algorithmic_art
"I can" and "you can't" are not opposites.
I have no idea what you're trying to convey here.
“Artists don’t deserve to profit off their own work” is not what anyone said.
Even if people can just take your shit and profit off it - so can you.
This is not a complete rebuttal, but if you need the core fallacy spelled out, let's go slowly.
Why are you entitled to profit off the labor of someone else?
Why am I not entitled to profit from my own work building on stories I enjoy?
Isn't this a completely different conversation than the one we were having, and kind of missing the point? Yes, imo you should be allowed to do that. Still, AI companies are using the labor of millions of artists for free to train their AIs, which are then threatening to eliminate ways for these artists to earn income.
How is that related in any way to the ways that copyright has been exploited against fanmade art?
The conversation began with 'fuck copyright,' so no, restraints on new works are at least as relevant as money.
Restraints on derivative works seem directly relevant to railing against AI training. I don't think fanart goes on your side of the table. You're taking a stand for amateur references to immense professional works. Presumably on the basis that Disney can keep doing its thing no matter how many people draw their own weird Zootopia comics - yeah? I for one would argue the environment someone grew up in is fair game for them to build from, as much for Star Wars as for ancestral fairy tales.
The environment of the internet is a pile of everyone's JPGs. We think nothing of amateur galleries where people mimic popular styles, borrow characters, and draw frankly unreasonable quantities of low-quality pornography. I'm not up-in-arms about AI because it's more of the same. I barely understand the objection. An entity learned English from library books? Yeah, that's how everything that can read English learned to read English. You couldn't understand this sentence without exposure to countless examples of text that were not explicitly provided for your education. So if there's a pile of linear algebra that can emit drawings, and it was shaped by looking at a ton of drawings that other people posted-- then-- as opposed to what?
The economic concerns are simpler: Hollywood is doomed. This is refrigeration, and they sell ice for iceboxes. They imagine it's going to make ice much easier to sell, since they won't need people to harvest and import it. In reality their entire business model is fucked, because their customers also won't need people to harvest and import it.
If they don't need a studio full of artists to make a cartoon movie... neither do you. Neither do all the artists they cast off. We cannot be far from models that tween pretty damn well on their own, and can be guided to tween flawlessly. The near future is not about to have less human art. No more than when Flash obviated colored paint on clear plastic.
Whether or not that's going to make anyone appropriate amounts of money under late capitalism is another question entirely, but it's not like artists were famously well-off before, and in any case Disney delenda est.
Even if text to image generators are able to improve to be better than human artists, people won't stop making art just because a computer can do it faster.
But they will stop hiring artists, and that's more to the point of what they were saying. We're already seeing some jobs being replaced with algorithms (mostly stuff like shitty click bait journalism, but still), and art has long been considered a skill not worth paying for. In centuries past, art used to be something only the rich could afford. Now, people get upset if artists charge $60 for a commission.
The algorithms won't need to produce work better than we can, or even equal. It just needs to make stuff that seems value appropriate. People have already made algorithms to imitate certain popular artists' styles, and they've seen a hit to their income as a result. Why get a commission done from one of them when you can go online and get 50 for free that are kinda close, and then just pick the one you like.
People have been getting automated out of their jobs for well over a century. Technology shouldn't stop advancing; everyone should be compensated for the human labor saved through the use of automation.
Even if people can just take your shit and profit off it - so can you.
What entitles someone to take another person's work and profit off it?
No.
The subject is intellectual property, in general.
Why am I not entitled to share culture?
Why am I not entitled to create, if similarity exists?
Why does someone get to own an idea, just because they wrote it down first?
Is the world's copyright system flawed? Yes. Should it be completely removed? No, because otherwise a lot of creative branches would be unsustainable. Artists need money, musicians need money etc.
I'm not against copyright. I'm trying to guide this other user through why their post is a nonsensical response to someone who is.
If you don't want to defend what you said that's fine but I'm not going to pretend we were actually talking about something else. 🤷♂️ "Share culture" is not when you take someone else's drawing and dropship hundreds of shittily made tshirts on the Facebook marketplace. That's what IP protects artists from and fighting stuff like that takes up a stupid amount of time for anyone that isn't a corporation.
If you are creating that's absolutely fine. But shit you typed into AI isn't creating anything and literally couldn't exist without the people that actually create art.
So you still have no idea what I'm trying to convey here.
You're fixated on one aspect, and ignoring all other consequences. "If you are creating that’s absolutely fine" is NOT what any version of copyright law says. Not ever. So demanding an answer to an explanation of why your first response was a strawman is not the mic-drop you think it is.
Again:
“Artists don’t deserve to profit off their own work” is a position you made up. They're your words. It's a thing you, and you alone, have said. But that's never the same thing as whether anyone else can. This is such a basic 'not-all doesn't mean none' distinction, and it is the only reason I wrote the only words you chose to read.
Look at who they responded to. The statement is not a strawman when the person being responded to doesn't believe in respecting the rights of creators.
As if copyright never censors creators.
Fuck your sophistry.
If you don't want to discuss this, leave.
No. Fuck that. I don't consent to my art or face being used to train AI. This is not about intellectual property, I feel my privacy violated and my efforts shat on.
Unless you have been locked in a sensory deprivation tank for your whole life, and have independently developed the English language, you too have learned from other people's content.
Well, my knowledge can't be used as a tool of surveillance by the government and the corporations, and I have my own feelings, intent, and everything in between. AI is artificial intelligence; AI is not an artificial person. AI doesn't have thoughts, feelings, or ideals. AI is a tool, an empty shell that is used to justify stealing data and surveillance.
This very comment is a resource that government and corporations can use for surveillance and training.
AI doesn't have thoughts? We don't even know what a thought is.
We may not know what comprises a thought, but I think we know it's not matrix math, which is basically all an LLM is.
Hard disagree; the neural connections in the brain can be modeled with matrix math as well. Sure, some people will be uncomfortable with that notion, especially if they believe in spiritual stuff outside physical reality. But if you're the type that thinks consciousness is an emergent phenomenon of our underlying biology, then matrix math is a reasonable approach to representing states of our mind and what we call thoughts.
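For what it's worth, here's what "matrix math" literally looks like for a single artificial layer, in a toy NumPy sketch of my own (nothing more than an illustration):

```python
import numpy as np

# One artificial "layer" is just a matrix multiply, a bias, and a nonlinearity.
# Whether stacks of these are a fair model of biological neurons is exactly the
# open question being argued about here.
x = np.random.randn(16)                  # input activations
W = np.random.randn(32, 16)              # connection weights ("synapses")
b = np.random.randn(32)                  # biases
activation = np.maximum(0.0, W @ x + b)  # ReLU nonlinearity
print(activation.shape)                  # (32,)
```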
Yet we live in a world where people will profit from the work and creativity of others without paying any of it back to the creator. Creating something is work, and we don't live in a post-scarcity communist utopia. The issue is the "little guy" always getting fucked over in a system that's pay-to-play.
Donating effort to the greater good of society is commendable, but people also deserve to be compensated for their work. Devaluing the labor of small creators is scummy.
I'm working on a tabletop setting inspired by the media I consumed. If I choose to sell it, I'll be damned if I'm going to pay royalties to the publishers of every piece of media that inspired me.
If you were a robot that never needed to eat or sleep and could generate 10,000 tabletop RPGs an hour with little to no creative input then I might be worried about whether or not those media creators were compensated.
The efficiency something can be created with should have no bearing on whether someone gets paid royalties.
It absolutely should, especially when the "creator" is not a person. AI is not "inspired" by training data, and any comparisons to human artists being inspired by things they are exposed to are made out of ignorance of both the artistic process and how AI generates images and text.
It's impossible to make any comparison between how AI and how humans make decisions without understanding the nature of consciousness. Simply understanding how AI works isn't enough.
Are you seriously suggesting that human creativity works by learning to reduce the amount of random noise they output by mapping words to patterns?
Is that what they suggested? Or are you just wanting to be mad about something?
Fair, my point was: why do we need to understand how consciousness works to see that the way consciousness creates art is not very comparable to a machine recognizing patterns?
The commenter above compared inspiration to the way AI companies are using the labor of millions of artists for free. In this context I assume that's what they were hinting at when responding to "AI is not being inspired" with "we don't know how consciousness works".
Because one is a black box that very well may just be a much more advanced version of what current AI does. We don't know yet. It's possible that with the training of trillions of trillions of moments of experience a person has that an AI may be comparable.
I mean, the likelihood is basically zero, but it's impossible to prove the negative. At the end of the day, our brains are just messy Turing machines with a lot of built-in shortcuts. The only things that set us apart are how much more complicated they are and how much more training data we provide them. Unless we can crack consciousness, it's very possible that some day in the future we will build an incredibly rudimentary AGI without even realizing that it works the same way we do, scaled down. But without truly knowing how our own brain works, how can we begin to claim it does or doesn't work like something else?
None of which means it is impossible to determine whether or not an algorithm that couldn't exist without the work of countless artists should have the same IP rights as a human being making art (the answer is no).
When, not if, such a robot exists, do you imagine we'll pay everyone who'd ever published a roleplaying game? Like a basic income exclusively for people who did art before The Training?
Or should the people who want things that couldn't exist without magic content engines be denied, to protect some prior business model? Bear in mind this is the literal Luddite position. Weavers smashed looms. Centuries later, how much fabric in your life do you take for granted, thanks to that machinery?
'We have to stop this labor-saving technology because of capitalism' is a boneheaded way to deal with any conflict between labor-saving technology... and capitalism.
I know this is shocking to people who have never picked up a brush or written anything other than internet arguments after the state stopped mandating they do so when they graduated high school, but you can just create shit without AI. Literal children can do it.
And there is no such thing as something that “couldn’t exist” without the content stealing algorithm. There is nothing it can create that humans can’t, but humans can create things it can’t.
There’s also something hilarious about the idea that the real drag on the artistic process was the creating art part. God almighty, I’d rather be a Luddite than a Philistine.
Fuck off.
If that sounds needlessly blunt, no, it's far kinder than your vile opening insult, which you can't even keep straight - immediately noting that everyone, even children, has done an art at some point.
So maybe this discussion of wild new technology versus 17th-century law and the grindstone of capitalism isn't about individual moral failings of people who disagree with you.
Ooh, hit close to home to point out AI guzzlers are talentless hacks? Take a nap bro, you seem to need it.
Those 10,000 tabletop RPGs will almost certainly be completely worthless on their own, though they might contain some novel ideas, at least until a human comes by, scours them for ideas, and combines them. It could very well be that in that same time it only produces one coherent tabletop RPG's worth of ideas.
Should be mentioned though, AIs don't run for free either, they cost quite a lot of electricity and require expensive hardware.
Then don't post your art or face publicly. I agree with you if it's obtained through malicious means, but if you post it publicly, then expect it to be used publicly.
If you post your art publicly why should it be legal for Amazon to take it and sell it? You are deluding yourself if you believe AI having a get out of jail free card on IP infringement won't be just one more source of exploitation for corporations.
Taking it and selling it is obviously not legal, but taking it and using it for training data is a whole different thing.
Once a model has been trained, the original data isn't used for shit; the output the model generates isn't your artwork anymore, and it isn't really anybody's.
Sure, with some careful prompts you can get the model to output something fairly close to your style, but the output image isn't yours. It's whatever the model conjured up based on the prompt, in your style. The resulting image is still the model's own original work.
It's akin to someone downloading your art, tracing it over and over again till they learn the style and then going off to draw non-traced original art just in your style
"You don't understand, it's not infringement because we put it in a blender first" is why AI "art" keeps taking Ls in court.
Care to point to those Ls in court? Because as far as I'm aware they are mostly still ongoing and no legal consensus has been established.
It's also a bit disingenuous to boil his argument down to what you did, as that's obviously not what he said. You would most likely agree that a human would produce transformative works even when basing most of their knowledge on that of another (such as tracing).
Ideas are not copyrightable, and neither are styles. You could use that understanding of another's work to create forgeries of their work, but hence why there exist protections against forgeries. But just making something in the style of another does not constitute infringement.
"U.S. District Judge William Orrick said during a hearing in San Francisco on Wednesday that he was inclined to dismiss most of a lawsuit brought by a group of artists against generative artificial intelligence companies, though he would allow them to file a new complaint."
"The judge also said the artists were unlikely to succeed on their claim that images generated by the systems based on text prompts using their names violated their copyrights.
"I don't think the claim regarding output images is plausible at the moment, because there's no substantial similarity" between images created by the artists and the AI systems, Orrick said."
Hard to call that an L, so I'm eagerly awaiting them.
Biggest L: that shit ain’t copyrightable. A doodle I fart out on a sticky pad has more legal protection.
That's true, but the reason behind that is reasonable. There has been no human intervention, and so like photos taken by animals, they hold no copyright.
But that's not what you should be doing anyways, a tool must be used as part of a process, not as *the process*.
A court saying "nobody owns this" is closer to the blender argument than the ripoff argument.
If the large corporations can use IP to crush artists, artists might as well try to milk every cent they can from their labor. I dislike IP laws as well, and you can never use the masters' tools to dismantle their house, but you can sure as shit do damage and get money for yourself.
Luckily, AI models aren't the master's tools; they're a public technology. That's why they're already trying their hand at regulatory capture, just like they're trying to destroy encryption. Support open source development; it's our only chance. Their AI will never work for us. John Carmack put it best.
AI algorithms aren't the master's tools in the sense that anyone can set up a model using free code on cheap tech, but they do require money to improve and produce quality products. The training data requires the labor of artists to produce, the computers the model runs on require labor to make, and the electricity that allows the model to continuously improve requires, you guessed it, labor. Labor will always, and should always, be expensive, making AI most useful to those with money to spend.
IP is only one tool in the masters' tool box, but capitalism is their box, and the other forms of capital can be used to swing the public's tools far harder than a wage laborer.
Hence why it's important that open source models should keep the support of the public. If they have to close shop, what remains will be those made by tech giants. They will be censored, they will be neutered, except if you pay enough for them. The power should remain with the artists, and not those with the money.
That is a fundamental misunderstanding of how AI works. It does not shred the art and recreate things with the pieces. It doesn't even store the art in the algorithm. One of the biggest methods right now basically starts from an image of purely random pixels. You show it a piece of art with a whole lot of tags attached. It then semi-randomly changes pixel colors until it matches the training image. That set of instructions is associated with the tags, and the two are combined into a series of tiny weights that the randomizer uses. Then the next image modifies the weights. Then the next, then the next. It's all just teeny tiny modifications to random number generation.

Even if you trained an AI on only a single image, it would be almost impossible for it to produce it again perfectly, because each generation starts with a truly (as truly as a computer can get: unweighted) random image of pixels. Even if you force-fed it the same starting image of noise that it trained on, it is still only weighting random numbers and still probably won't recreate the original art, though the result may be more or less indistinguishable at a glance.
AI is just another tool. Like many digital art tools before it, it has been maligned from the start. But the truth is what it produces is the issue, not how. Stealing others' art by manually reproducing it or using AI is just as bad. Using art you're familiar with to inspire your own creation, or using an AI trained on known art to make your own creation, should be fine.
As a side note, because it wasn't too clear from your writing: the weights are only tweaked a tiny, tiny bit by each training image. Unless the trainer sees the same image a shitload of times (the Mona Lisa, that one stock photo used to show off phone cases, etc.), the image can't be recreated by the AI at all. Elements of the image that are shared with lots of other images (shading style, poses, Mario's general character design, etc.) could be, but you're never getting that one original image, or even any particular identifiable element from it, out of the AI. The AI learns concepts and how they interact because the amount of influence it takes from each individual image and its caption is so incredibly tiny, while it trains on hundreds of millions of images and captions. The goal of AI image generation is to be able to create a vast variety of images directed by prompts; generating lots of images which directly resemble anything in the training set is undesirable, and in the field that failure mode is called over-fitting.
Anyways, the end result is that AI isn't photo-bashing, it's more like concept-bashing. And lots of methods exist now to better control the outputs, from ControlNet, to fine-tuning on a smaller set of images, to Dalle-3 which can follow complex natural language prompts better than older methods.
Regardless, lots of people find training generative AI on a mass of otherwise copyrighted data (images, fan fiction, news articles, ebooks, what have you) without prior consent just really icky.
That's what I meant by "very finely shredded pieces". I oversimplified it, yes. What I mean is that it's not literally taking a pixel off an image and putting it into the output, but that using the original image in any way is just copying with extra steps.
Say we forego AI entirely and talk real-world copyright. If I were to record a movie theater screen with a camcorder, I would commit copyright infringement, even though the work is transformed by my camera lens. Same as if I were to distribute the copyrighted work in a ZIP file, invert its colors, or trace every frame and paint it with watercolors.
What if I were to distribute the work's name alongside its SHA-1 hash? You might argue that such a transformation destroys the original work, can no longer be used to retrieve it, and therefore should be legal. But if that were the case, torrent site owners could sleep peacefully knowing they are safe from prosecution. The real world has shown that's not the case.
Now, what if we take some hashing function and brute force the seed until we get one which outputs the SHA-1s of certain works given their names? That would be a terrible version of AI, acting exactly like an over-trained model would: spouting random numbers except for works it was "trained" upon. Is distributing such a seed/weight a copyright violation? I'd argue it would be an overly complicated way to conceal piracy, but yes, it would be. Because those seeds/weights are still based on the original works, even if not strictly a direct result of their transformation.
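To make that thought experiment concrete, here's a toy sketch (every name and value in it is made up for illustration, nothing here is from any real model): it brute-forces a seed for a trivial "model" that is nothing but a seeded random number generator, until its outputs happen to reproduce a couple of hex characters of each work's SHA-1. The only reason the search finishes is that the target is tiny.

```python
import hashlib
import random

# Toy "library": two works and (a prefix of) the SHA-1 of their contents.
works = {
    "work_a.txt": b"the full text of work A",
    "work_b.txt": b"the full text of work B",
}
targets = {name: hashlib.sha1(data).hexdigest()[:2] for name, data in works.items()}

def toy_model(seed: int, name: str) -> str:
    """A 'model' that is nothing but a PRNG keyed on a seed and the work's name."""
    rng = random.Random(f"{seed}:{name}")
    return f"{rng.getrandbits(8):02x}"   # emit two hex characters

# Brute-force a seed whose outputs reproduce each target hash prefix.
seed = 0
while not all(toy_model(seed, n) == t for n, t in targets.items()):
    seed += 1

# This single integer now "contains" both hash prefixes: distributing it
# alongside the work names is just distributing the hashes with extra steps.
print("found seed:", seed)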
Copying concepts is also a copyright infringement, though
It shouldn't be just "icky", it should be illegal and be prosecuted ASAP. The longer it goes on like this, the more the entire internet is going to be filled with those kind-of-copyrighted things, and eventually turn into a lawsuit shitstorm.
Heads up, this is a long fucking comment. I don't care if you love or hate AI art, what it represents, or how it's trained. I'm here to inform, refine your understanding of the tools (and how exactly that might fit in the current legal landscape), and nothing more. I make no judgements about whether you should or shouldn't like AI art or generative AI in general. You may disagree about some of the legal standpoints too, but please be aware of how the tools actually work because grossly oversimplifying them creates serious confusion and frustration when discussing it.
Just know that, because these tools are open source and publicly available to use offline, Pandora's box has been opened.
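```

Obviously no real model works like this; it's just the over-trained edge case taken to its extreme, which is the point of the thought experiment.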
Except it really isn't in many cases, and even in the cases where it could be, there can be rather important exceptions. How this all applies to AI tools/companies themselves is honestly still up for debate.
Copyright protects actual works (aka "specific expression"), not mere ideas.
The concept of a descending-blocks puzzle game isn't copyrighted, but the very specific mechanics of Tetris are. The concept of a cartoon mouse isn't copyrighted, but Mickey Mouse's visual design is. The concept of a brown-haired girl with wolf ears/tail and red eyes is not copyrighted, but the exact depiction of Holo from Spice and Wolf is (though that's more complicated due to weaker trademark and stronger copyright laws in Japan). A particular chord progression is not copyrightable (or at least it shouldn't be), but a song or performance created with it is.
A mere concept is not copyrightable. Once the concept is specific enough and you have copyrighted visual depictions of it, then you start to run into trademark law territory and start to gain a copyright case. I really feel like these cases are kinda exceptions though, at least for the core models like Stable Diffusion itself, because there's just so much existing art (both official and, even more so, copyright/trademark-infringing fan art) of characters like Mickey Mouse anyways.
The thing the AI does is distill concepts, and interactions between concepts, shared between many input images, and it can do so in a generalized way that allows concepts never before seen together to be mixed together easily. You aren't getting transformations of specific images out of the AI, or even small pieces of each trained image; you're getting transformations of learned concepts shared across many, many works. This is why the shredding analogy just doesn't work. The AI generally doesn't, and is not designed to, mimic individual training images. A single image changes the weights of the AI by such a minuscule amount, and those exact same weights are also changed by many other images the AI trains on. Generative AI is very distinctly different from tracing, or from distributing information precisely specific enough to pirate content, or from transforming copyrighted works to make them less detectable.
To drive the point home, I'd like to expand on how the AI and its training is actually implemented, because I think that might clear some things up for anyone reading. I feel like the actual way in which the AI training uses images matters.
A diffusion model, which is what current AI art uses, is a giant neural network that we train to guess the noise pattern in an image. To train it on an image, we add some random amount of noise to the whole image (it could be a small amount like film grain, or enough to make the image complete noise; the amount is random each time), then pass that image and its caption through the AI to get the noise pattern the AI guesses is in the image. Now we take the difference between the noise pattern it guessed and the noise pattern we actually added to the training image to calculate the error. Finally, we tweak the AI's weights based on that error. Of note, we don't tweak the AI to perfectly guess the noise pattern or reduce the error to zero; we barely tweak the AI to guess ever so slightly better (like, 0.001% better). Because the AI is never supposed to see the same image many times, it has to learn to interpret the captions (and thus concepts) provided alongside each image to direct its noise guesses. The AI still ends up being really bad at guessing high noise or completely random noise anyway, which is yet another reason why it can't generally reproduce existing trained images from nothing.
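If it helps to see that paragraph as code, here's a minimal sketch of one training step. It isn't any particular library's trainer: the `model(noisy, caption_emb, t)` signature, the linear noising schedule, and the plain MSE loss are simplifying assumptions of mine (real diffusion models use variance-preserving schedules and more careful loss weighting), but the shape of the loop is the same, and the tiny learning rate on the optimizer is what makes each tweak so small.

```python
import torch
import torch.nn.functional as F

def train_step(model, optimizer, image, caption_emb):
    """One (simplified) diffusion training step on a single image + caption."""
    t = torch.rand(1)                          # random noise level in [0, 1)
    noise = torch.randn_like(image)            # the noise we actually add
    noisy = (1 - t) * image + t * noise        # toy linear noising schedule

    predicted = model(noisy, caption_emb, t)   # the model guesses the noise pattern
    loss = F.mse_loss(predicted, noise)        # how far off was the guess?

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                           # nudge the weights a tiny bit
    return loss.item()
```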
Now let's talk about generation (aka "inference"). So we have an AI that's decent at guessing noise patterns in existing images as long as we provide captions. This works even for images it didn't train on. That's great for denoising and upscaling existing images, but how do we get it to generate new unique images? By asking it to denoise random noise and giving it a caption! It's still really shitty at this though; the image just looks like blobby splotches of color with no form (else it probably wouldn't work at denoising existing images anyway). We have a hack though: add some random noise back into the generated image and send it through the AI again. Every time we do this, the image gets sharper and more refined, and looks more and more like the caption we provided. After doing this 10-20 times we end up with a completely original image that isn't identifiable in the training set but looks conceptually similar to existing images that share similar concepts.

The AI has learned not to copy images during training; it has actually learned visual concepts, which are generally not copyrighted. Some very specific depictions it learns are technically copyrighted, e.g. Mickey Mouse's character design, but the problem with that claim is that there are fair use exceptions, legitimate use cases, which can often cover someone who uses the AI in this capacity (parody, educational, not for profit, etc.). Whether providing a tool that can just straight up allow anyone to create infringing depictions of common characters or designs is legal is up for debate, but when you use generative AI it's up to you to know the legality of publishing the content you create with it, just like with hand-made art.

And besides, if you ask an AI model or another artist to draw Mickey Mouse for you, you know what you're asking for; it's not a surprise, and many artists would be happy to oblige so long as their work doesn't get construed as official Disney company art. (I guess that's sort of a point of contention about this whole topic though, isn't it? If artists could get takedowns on their Mickey Mouse art, why wouldn't an AI model get takedowns too for trivially being able to create it?)
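And the generation loop from that description, again as a rough sketch rather than any real sampler (actual pipelines use DDPM/DDIM-style schedulers with carefully derived coefficients; the step size and the amount of re-injected noise below are made-up illustrative numbers):

```python
import torch

@torch.no_grad()
def generate(model, caption_emb, steps=20, shape=(1, 3, 512, 512)):
    """Start from pure noise and repeatedly 'denoise a bit, re-noise a bit'."""
    image = torch.randn(shape)                     # pure random noise to start
    for i in reversed(range(steps)):
        t = torch.tensor([(i + 1) / steps])        # current (decreasing) noise level
        predicted = model(image, caption_emb, t)   # model's guess at the noise
        image = image - predicted / steps          # peel a little of it away
        if i > 0:                                  # the "hack": add some noise back
            image = image + 0.1 * (i / steps) * torch.randn_like(image)
    return image                                   # a new image guided by the caption
```

Each pass sharpens the image toward the caption, which is the "do this 10-20 times" loop described above.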
Anyways, if you want this sort of training or model release to be a copyright violation, as many do, I'm unconvinced current copyright/IP laws could handle it gracefully, because even if the precise method by which AI's and humans learn and execute is different, the end result is basically the same. We have to draw new more specific lines on what is and isn't allowed, decide how AI tools should be regulated while taking care not to harm real artists, and few will agree on where the lines should be drawn.
Also though, Stable Diffusion and its many, many descendants are already released publicly and open source (same with Llama for text generation), and it's been disseminated to so many people that you can no longer stop it from existing. That fact doesn't give StabilityAI a pass, nor do other AI companies who keep their models private get a pass, but it's still worth remembering that Pandora's box has already been opened.
I agree.
The problem is that you might technically be allowed to, but that doesn't mean you have the funds to fight every court case from someone insisting that you can't or shouldn't be allowed to. There are some very deep pockets on both sides of this.
Let them come.
I like your moxy, soldier
What a tough guy you are, based AI artist fighting off all the artcucks like a boss 😎
That's not it going both ways. You shouldn't be allowed to use anyone's IP against the copyright holder's wishes, regardless of size.
Nah all information should be freely available to as many people as practically possible. Information is the most important part of being human. All copyright is inherently immoral.
I'd agree with you if people didn't need to earn money to live. You can't enact communism by destroying the supporters of it. I fully support communism in theory, we should strive for a community based government. I'm also a game developer and when I make things I need to be able to pay my bills because we still live in capitalism.
If copyright was vigorously enforced a lot more people would starve than would be fed
It's already vigorously enforced. Maybe you can expand on what you mean.
Go on Etsy, search Mickey Mouse, and go report all the hundreds of artists whose whole careers are violating the copyright of the most litigious company on the planet.
As long as I'm not pretending to be Nintendo, can you quantify how exactly releasing, for example, a fan Mario game is unethical? It having the potential to hurt their sales if you make a better game than them doesn't count because otherwise that would imply that out-competing anyone in any market must be unethical, which is absurd.
No, it's about if you make a game that's worse or off-brand. If you make a bunch of Mario games into horror games and then everyone thinks of horror when they think of Mario then good or bad, you've ruined their branding and image. Equally, if you make a bunch of trash and people see Mario as just a trash franchise (like how most people see Sonic games) then it ruins Nintendo's ability to capitalize on their own work.
No one is worried about a fan-made Mario game being better.
Wouldn't that only be a problem if you pretended to be Nintendo? There are a lot of fan Sonic games and I don't think it's affected Sonic's image very much. There's shit like Sonic.EXE, which is absolutely a horror game, but people still don't think of horror when they think of Sonic. The reason Sonic is a trash franchise is because of SEGA consistently releasing shitty Sonic games. Hell, some of the fan games actually do beat out the official games in quality and polish.
Hypothetically, assuming you're right and fan games actually do hurt their branding, there's still no reason for it to be illegal. We don't ban shitty mobile games for giving good mobile games a bad rep, and neither Nintendo nor mobile devs can claim libel or defamation.
Actually we do. If your game is so terrible, it can get removed from the Google Play store. It has to be really bad and they rarely do it. Steam does this as well. Epic avoids this by vetting the games they put on their platform site first. It's why the term asset flip exists.
Ban in a legal sense, not ban from a proprietary store.
Sure, that's fair. Either way, good or bad, it's illegal to take someone's existing IP and add on to it. No one is legally banning games based on quality, and whether they're good or bad doesn't matter. It matters that the original story owner has a vision, and they have the right to make money off of their work without other people trying to take it or add on to the story themselves. Brand fatigue is a real thing: while all of the CoD games are great and fairly high quality, the reason they get a bad rap is exactly brand fatigue. The Assassin's Creed games were the same for a bit, but they resolved that by spreading releases out.
Either way, the arguments for letting people just add on to whatever they want seem to fall flat. Why not just build a universe like Mario and call it Maryo? Or The Great Giana Sisters? Why involve someone else's IP at all? Because you are profiting off of the popularity that someone else built with quality products. Even in a communist society I'd want that banned, because it's lazy, misrepresents the original vision, and overall it's completely avoidable. In fact, why not force people to create new things instead of letting them be lazy and steal the popularity of a well-made IP?
Brand fatigue is definitely a real thing, I will agree with you there. In the same boat, so is genre fatigue. If you play a lot of platformers, you'll eventually get bored and move on to something else. Nintendo clearly has no right to go after other platforming games that have copied from Mario's mechanics, so why should they have the right to go after games that use Mario's aesthetic or name?
Fan games are not normally morally wrong, but I do think it's kind of trashy to try and make money off of someone else's brand if you're not doing it out of passion. I just don't see it as a legal problem, much like how crappy off-brand ripoffs and lazily made games aren't a legal problem. It's also worth noting that the people making the most money off of the IP are just executives that likely had little to no part in creating the characters and care less about appreciating the artistic work than most fan game makers.
Look at Sonic P-06, a fan-made remake of Sonic '06. It's highly polished, has fleshed out features that never made it into the original game, and is absolutely a labor of love. The original came out a buggy pile of garbage because it was forced out by those aforementioned business-people before it was ready. Most companies' strict protection of their IP just prevent works of art like P-06 from seeing the light of day. I think that SEGA's no Sonic monetization policy being enshrined into law would be a reasonable compromise.
This argument doesn't hold water to me. How are you conflating a brand with all games in a genre? IP and brands are a signifier of what to expect. Sherlock Holmes is a great example of what used to be an IP known for a very specific style of mystery story but now just means "genius problem solver maybe with a drug habit."
We are going to just have to agree to disagree. Spiritual successors happen all the time. The reason people make fan games is that there is a lexicon built into their project already. It's a shortcut instead of building (and considering) what is meaningful to your game in a lexicon. Additionally, a lot of people do not consider how their changes affect the lexicon throughout all games. So what you are left with is mostly people who don't truly understand, never talked to the creators, never worked with them, assume everything from a product perspective, pushing out something that adds to the brand without any true coherency or consideration of future titles.
I for one see it as wrong to attempt to take someone's work and not only pass it off as your own but also potentially break their ability to make future iterations of their work.
Perhaps a labor of love. From my initial parsing of reviews, looks like they didn't attempt to change anything but kept close to the source material.
I feel like that's okay. There are plenty of original and better ideas out there. It doesn't prevent things from being made. P-06 being "still terrible but better" doesn't really say anything. Games like Black Mesa, Sonic Media, Skywind, and very specifically Zera: Myths Awaken also exist; that last one started as Spyro 4 until Activision sent a C&D, so they rebranded. These things are easily fixed, and honestly, it's great if fans want to attempt to build a game off of someone's IP and then ask the IP holder if they can continue with it. If not, they can just rebrand and create their own universe. Temtem is a great example of what happens when people are forced to create their own thing. It becomes more impactful and allows for a more interesting product, not just a cookie-cutter game.
I get what you're saying. The problem is that you can't argue your case just by giving examples of how IP enforcement can lead to people innovating more in the way that I can. The debate is asymmetrical because all I have to do is show that IP violation doesn't uniquely negatively affect IP holders in ways that legal activities can't, demonstrating precedent for that kind of competition/harm being legal. You need to justify forcibly imposing limitations on what people are allowed to do, which has a higher "burden of proof" if you get what I'm saying.
I don't think that's all you have to show. In fact, I feel like a lot of your examples have missed the point entirely. The point isn't that there are other ways that could maybe impact the sales or recognition a game gets. This conversation also jumps between copyright and trademark protections. Copyright is about protecting actual assets: you take my assets and make something else with them without my consent, you've stolen my work. That's a bad, immoral thing. It equally ties to the copyright of characters and story: you take the building blocks I've made, you've stolen my work. Trademark is about protecting brand recognition and deals with IP violations.
With that covered, the point of copyright is to ensure the people who did the work get paid for their work. That large or small companies don't have to worry about someone stealing their works and allowing them to innovate.
So when you say, whatever people will just make another game in the genre, that's innovation. What if it's bad? Not everything is good, that's still positive innovation. You always mentioned large IPs but the truth is that the law is equal and that large corporations are already taking people's work without asking. We do not need more of that.
For fan-made games, there isn't a huge point in doing them without the blessing of the IP holder. You might point to large studios vs. small fans, but think of it at every scale, especially middle-to-small studios being stolen from. "It's just a fan game" doesn't really hold water when it's potentially putting people out of business because of the issues I've already shown. Imagine if copyright wasn't enforced and someone just re-uploaded existing games. Where do you draw the line on the changes a fan game needs to make in order to qualify? I can tell you right now 99% of players wouldn't buy a game on Steam if there was another fan game exactly like it but for free. So then you have to draw up all these lines that are frankly unfair to the creators. So just let them choose. Their works belong to them, not to people who merely like the game.
So I don't see any point for which a looser copyright law would be overall more helpful to society. We need courts to allow for smaller creators to justify fair use but that doesn't cover anything we talked about.
"Artists don't deserve to profit off their own work" is stupid as shit. Complain about copyright abuse and lobbying a la Disney and I'll be right there with you, but people shouldn't have the right to take your work and profit off it without either your consent or paying you for it.
Artists and other creatives who actually do work to create art (not shitting out text into an image generator) should take every priority over AI "creators."
No you don't understand, the machine works exactly like a human brain! That makes stealing the work of others completely justifiable and not even really theft!
/s, bc apparently this community has a bunch of dumbass tech bros that genuinely think this
This, but mostly unironically. And before you go insulting me, I'm an artist myself and wouldn't be where I am if I wasn't allowed to learn from other people's art to teach myself.
Having a masters degree in machine learning doesn't make you an artist grow tf up xD
No, but drawing for last 13 years of my life does.
don't dismiss me, I like being spit on
And this is a strawman. If this argument is being made, it's most likely because of their own misunderstanding of the subject. They are most likely trying to make the argument that the way biological neural networks and artificial neural networks 'learn' is similar, which is true to a certain extent, since one is derived from the other. There's a legitimate argument to be made that this inherently provides transformation, and it's exceptionally easy to see that in most unguided prompts.
I haven't seen your version of this argument being spoken around here at all. In fact it feels like a personal interpretation of someone who did not understand what someone else was trying to communicate to them. A shame to imply that's an argument people are regularly making.
Equating training AI to not being able to profit is stupid as shit, and it's the same bullshit argument big companies use when they say "we lost a bazillion dollars to people pirating our software". Someone training their AI on an artwork (that is probably under a Creative Commons licence anyway) doesn't suck money out of an artist's pocket that they would have otherwise made.
Why are you the one that gets to decide what is "work" to create art? Should digital artists not count because they are computer assisted, don't require as much skill and technique as "traditional" artists and use tools that are based on the work of others like, say, brush makers?
And the language you use shows that you're vindictive and angry.
My brother in Christ, they didn't even allude to this, this is an entirely new thought.
Yeah, no shit, Sherlock. I'm applying their flawed logic to other situations where the conclusion is even more dumb, so he can see that the logic doesn't work.
But leaping from 'copyright is made-up' to 'so artists should starve?' is totally reasonable discourse.
Explain what you think will happen if artists who depend on their art to make a living lose copyright protection.
Commissions, patronage, subscriptions, everything else rando digital artists do when any idiot can post a JPG everywhere and DMCA takedowns accomplish roughly dick.
Meanwhile - you wanna talk about people who've been fucked over by corporations that decide their original artwork is too close to something a dead guy made?
Or look into whether professional artists were having a good time, before all this? Intellectual property laws have funded and then effectively destroyed countless years of effort by artists who aren't even allowed to talk about it due to NDAs. CGI firms keep losing everything and going under while the movies they worked on make billions. The status quo is not all sunshine and rainbows. Pretending the choice is money versus nothing is deeply dishonest.
“Just internet beg” oh okay. Shows exactly how much you value the people making the art you want to feed into the instant gratification machine.
'How money?' Existing lucrative business ventures by quite a lot of artists. 'So beg!' Yeah you got me, how intellectually sincere, gold sticker, you can leave.
They said IP, IP protects artists from having their work stolen. The fact AI guzzlers are big mad that IP might apply to them too is irrelevant.
Digital artists do exactly as much work as traditional artists, comparing it to AI “art” from an AI “artist” is asinine. Do you actually think digital artists just type shit in and a 3D model appears or something?
And yeah I’m angry when my friends and family who make their living as actual artists, digital and traditional, have their work stolen or used without their permission. They aren’t fucking corporations making up numbers about lost sales, they’re spending weeks trying to get straight up stolen art mass printed on tshirts and mugs removed from online sale. They’re going outside and seeing their art on shit they’ve never sold. Almost none of them own a home or even make enough to not have a regular job, it’s literally taking money out of their pockets to steal their work. This is the shit you’re endorsing by shitting on the idea of IP.
Do you actually think artists using AI tools just type shit into the input and output decent art? It's still just a new, stronger digital tool. Many previous tools have been demonized, claiming they trivialize the work and people who used them were called hacks and lazy. Over time they get normalized.
And as far as training data being considered stealing IP, I don't buy it. I don't think anyone who's actually looked into what the training process is and understands it properly would either. For IP concerns, the output should be the only meaningful measure. It's just as shitty to copy art manually as it is to copy it with AI. Just because an AI used an art piece in training doesn't mean it infringed until someone tries to use it to copy it. Which, agreed, is a super shitty thing to do. But again, it's a tool, how it's used is more important than how it's made.
Lmao, I’ve used AI image generation, you’re not going to be able to convince me any skill was involved in what I made. The fact some people type a lot more and keep throwing shit at the wall to see what sticks doesn’t make it art or anything they’ve done with their own skill. The fact none of them can control what they’re making every time the sauce updates is proof of that.
If it's so obviously not an IP violation to train with it, then I'm sure it'll be totally fine if they train them without using artists' work taken without permission, since it totally wasn't relying on those IP-violating images. Yet for some reason they fight this tooth and nail. 🤔
Except they totally could. But a data source of that size where everyone explicitly opted in to AI training does not exist. The reason they fight it is also, in part, because training such models isn't exactly free. The hardware it's done on costs hundreds of thousands of dollars and must run for long periods. You would not just do that for the funsies unless you have to. And considering the data by all accounts seems to be collected in legal ways, they have cause to fight such cases.
It's a bit weird to use that as an argument to begin with, since a party that knows they are at fault usually settles rather than fighting on and incurring more costs. It's almost as if they don't agree with your assertion that they needed permission, and that those images were IP violating 🤔
Oh no it costs money to use the art stealing machine to make uncopyrightable trash? 🥺
"But a data source of such size of material where everyone opted in to use for AI explicitly does not exist. "
Dang I wonder why 🤔🤔🤔🤔
Because AI wasn't a big thing before 2020, and no such permission for obtained material has been legally necessary so far (lawsuits are pending, of course). If something has no incentive to exist, it will not be created. There are plenty of ethical justifications for why no such permission is needed as well.
Oh cool, you think art theft is ethically justified as long as it's a robot doing it
But sure, go ahead! I'm sure you'll change so many minds by immediately disregarding everything they say and putting them in a box of "thieves" because they said something that didn't fit very specifically within your "Guidebook to Hating Anything Related to AI".
Now, back to a serious discussion if you're up for it. Creative freedom is built on the notion that ideas are the property of nobody; it's a requirement, since every artist in existence has derived their work from the work of others. It's not even controversial: using your definition of stealing means all artists 'steal' from each other all the time, and nobody cares. But because a robot does it (despite that robot being 100% under the control of the artist using it), it's suddenly the most outrageous thing.
I know for sure my ideas have been 'stolen' from my publicized works, but I understand I had no sole right to those ideas to begin with. I can't copyright them. And if a 'thief' used those ideas in a transformative manner, rather than to create something that tries to recreate what I made (which would be actual infringement), they have every right to, because without that right literally nobody would be allowed to make anything, since everything we make is inspired by something we don't hold a copyright over. Most of the people actually producing stuff that will be displayed publicly, so other people will experience it and pull it apart to learn from, understand we have no right to those ideas to begin with, except in how we applied those ideas in a specific work.
Think of how much actual art you could've made in the time it took you to shill for the thing that's stealing art and fucking over creatives and using personal data to spy on people
Oh yeah, shame on me, spending a part of my day 'shilling' for myself and my friends and colleagues. And 'shilling' for a better future for us all by dissuading people from weaponizing bad arguments and misunderstandings to defend themselves, because that will not help them one bit. The latter part of your sentence is such utter nonsense that I don't even need to respond to it.
First of all, your second point is very sad to hear, but also a non-factor. You are aware people stole artwork before the advent of AI right? This has always been a problem with capitalism. It's very hard to get accountability unless you are some big shot with easy access to a lawyer at your disposal. It's always been shafting artists and those who do a lot of hard work.
I agree that artists deserve better and should gain more protections, but the unfortunate truth is that the wrong kind of response to AI could shaft them even more. Let's say inspiration could in some cases be ruled copyright infringement if the source of the inspiration could reasonably be traced back to another work. This could give big companies like Disney an easier pathway to sue people for copyright infringement; after all, your mind is forever tainted with their IP after you've watched a single Disney movie. Banning open source models from existing could also create a situation where those same big companies could build internal AI models from the art in their possession, while anyone without enough material could not. Which would mean that nobody but the people already taking advantage of artists would benefit from the existence of the technology.
I get that you want to speak up for your friends and family, and perhaps they do different work than I imagine, but do you actually talk to them about what they do in their work? Because digital artists also use non-AI algorithms to generate meshes and images. (And yes, that can be summed up as 'type shit in and a 3D model appears.') They also use building blocks, prefabs, and reference assets to create new unique assets. And like all artists, they do take (sometimes direct) inspiration from the ideas of others, as does the rest of humanity. Some of the digital artists I know have embraced the technology and combined it with the rest of their skills to create new works more efficiently and reduce their workload, either by being able to produce more or by spending more time refining works. It's just a tool that has made their life easier.
All of that is completely irrelevant to the fact that image generators ARE NOT PEOPLE and the way that people are inspired by other works has absolutely fuck all to do with how these algorithms generate images. Ideas aren’t copyrightable but these algorithms don’t use ideas because they don’t think, they use images that they very often do not have a legal right to use. The idea that they are equivalent is a self serving lie from the people who want to drive up hype about this and sell you a subscription.
I watch my husband work every day as a professional artist and I can tell you he doesn’t use AI, nor do any of the artists I know; they universally hate it because they can tell exactly how and why the shit it makes is hideous. They spot generated images I can’t because they’re used to seeing how this stuff is made. The only thing remotely close to an algorithm that they use are tools like stroke smoothing, which itself is so far from image generation it would be an outright lie to equate them.
Companies aren’t using this technology to ease artist workloads, they’re using it to replace them. There’s a reason Hollywood fought the strike as hard as they did.
The fact they are not people does not mean they can't use the same legal justifications that humans use. The law can't think ahead. The justification is rather simple, the output is transformative. Humans are allowed to be inspired by other works because the ideas that make up such a concept can't be copyrighted because they can be applied to be transformative. If the human uses that idea to produce something that's not transformative, it's also infringement. AI currently falls into that same reasoning.
You call it a self-serving lie, but I could just as easily say that about your arguments: that you only hold this opinion because you don't like AI (it seems). That's not constructive, and since I hope you care about artists as well, I implore you to actually engage in good-faith debate rather than just assuming the other person must be lying because they don't share your opinion. You are also forgetting that the people who benefit from image generators are people. They are artists too, most from before AI was a thing, and some because it became a thing.
Again, sorry to hear your husband feels that way. I feel he is doing himself a disservice by dismissing a new technology, as history has not been kind to those who oppose inevitable change, especially when there are no good non-emotional reasons against it. Most companies have never cared about artists; that fact was true without AI, and it will remain true whatever happens. But if they replace their artists with AI, they are fools, because the technology isn't that great on its own (currently). You need human intervention to use AI to make high quality output; it's a tool, not the entire process.
The Hollywood strikes are a good example of what artists should be doing rather than making certain false claims online. Join a union, get protection for your craft. Just because something is legal doesn't mean you can't fight for your right to demand specific protections from your employer. But strikes don't change laws; they're organizational, and have no ramifications for people who aren't part of the guilds involved. If a company that protects its artists while allowing them to use AI to accelerate their workflow comes out on top against a company that, despite its best intentions, has let its art department become less profitable, those artists will lose their jobs too. Since AI is very unlikely to go away completely even in the most optimistic of scenarios, that's eventually a worse situation than before.
And lastly, I guess your husband does different work than the digital artists I work with then. You have a ton of generation tools for meshes and textures. I also never equated it directly to AI, but you stated that they use no tools which do all the work for them (such as building a mesh for them), which is false. You wouldn't use the direct output of AI as well. I implore you to look at "algorithmic art": https://en.wikipedia.org/wiki/Algorithmic_art
"I can" and "you can't" are not opposites.
I have no idea what you're trying to convey here.
“Artists don’t deserve to profit off their own work” is not what anyone said.
Even if people can just take your shit and profit off it - so can you.
This is not a complete rebuttal, but if you need the core fallacy spelled out, let's go slowly.
Why are you entitled to profit off the labor of someone else?
Why am I not entitled to profit from my own work building on stories I enjoy?
Isn't this a completely different conversation than the one we were having, and kind of missing the point? Yes, in my opinion you should be allowed to do that. Still, AI companies are using the labor of millions of artists for free to train their AIs, which are then threatening to eliminate ways for these artists to earn income.
How is that related in any way to the ways that copyright has been exploited against fanmade art?
The conversation began with 'fuck copyright,' so no, restraints on new works are at least as relevant as money.
Restraints on derivative works seem directly relevant to railing against AI training. I don't think fanart goes on your side of the table. You're taking a stand for amateur references to immense professional works. Presumably on the basis that Disney can keep doing its thing no matter how many people draw their own weird Zootopia comics - yeah? I for one would argue the environment someone grew up in is fair game for them to build from, as much for Star Wars as for ancestral fairy tales.
The environment of the internet is a pile of everyone's JPGs. We think nothing of amateur galleries where people mimic popular styles, borrow characters, and draw frankly unreasonable quantities of low-quality pornography. I'm not up-in-arms about AI because it's more of the same. I barely understand the objection. An entity learned English from library books? Yeah, that's how everything that can read English learned to read English. You couldn't understand this sentence without exposure to countless examples of text that were not explicitly provided for your education. So if there's a pile of linear algebra that can emit drawings, and it was shaped by looking at a ton of drawings that other people posted-- then-- as opposed to what?
The economic concerns are simpler: Hollywood is doomed. This is refrigeration, and they sell ice for iceboxes. They imagine it's going to make ice much easier to sell, since they won't need people to harvest and import it. In reality their entire business model is fucked, because their customers also won't need people to harvest and import it.
If they don't need a studio full of artists to make a cartoon movie... neither do you. Neither do all the artists they cast off. We cannot be far from models that tween pretty damn well on their own, and can be guided to tween flawlessly. The near future is not about to have less human art. No more than when Flash obviated colored paint on clear plastic.
Whether or not that's going to make anyone appropriate amounts of money under late capitalism is another question entirely, but it's not like artists were famously well-off before, and in any case Disney delenda est.
Even if text to image generators are able to improve to be better than human artists, people won't stop making art just because a computer can do it faster.
But they will stop hiring artists, and that's more to the point of what they were saying. We're already seeing some jobs being replaced with algorithms (mostly stuff like shitty click bait journalism, but still), and art has long been considered a skill not worth paying for. In centuries past, art used to be something only the rich could afford. Now, people get upset if artists charge $60 for a commission.
The algorithms won't need to produce work better than we can, or even equal. It just needs to make stuff that seems value appropriate. People have already made algorithms to imitate certain popular artists' styles, and they've seen a hit to their income as a result. Why get a commission done from one of them when you can go online and get 50 for free that are kinda close, and then just pick the one you like.
People have been getting automated out of their jobs for well over a century. Technology shouldn't stop advancing; everyone should be compensated for the human labor saved through the use of automation.
What entitles someone to take another person's work and profit off it?
No.
The subject is intellectual property, in general.
Why am I not entitled to share culture?
Why am I not entitled to create, if similarity exists?
Why does someone get to own an idea, just because they wrote it down first?
Is the world's copyright system flawed? Yes. Should it be completely removed? No, because otherwise a lot of creative branches would be unsustainable. Artists need money, musicians need money etc.
I'm not against copyright. I'm trying to guide this other user through why their post is a nonsensical response to someone who is.
If you don't want to defend what you said that's fine but I'm not going to pretend we were actually talking about something else. 🤷♂️ "Share culture" is not when you take someone else's drawing and dropship hundreds of shittily made tshirts on the Facebook marketplace. That's what IP protects artists from and fighting stuff like that takes up a stupid amount of time for anyone that isn't a corporation.
If you are creating that's absolutely fine. But shit you typed into AI isn't creating anything and literally couldn't exist without the people that actually create art.
So you still have no idea what I'm trying to convey here.
You're fixated on one aspect, and ignoring all other consequences. "If you are creating that’s absolutely fine" is NOT what any version of copyright law says. Not ever. So demanding an answer to an explanation of why your first response was a strawman is not the mic-drop you think it is.
Again:
“Artists don’t deserve to profit off their own work” is a position you made up. They're your words. It's a thing you, and you alone, have said. But that's never the same thing as whether anyone else can. This is such a basic 'not-all doesn't mean none' distinction, and it is the only reason I wrote the only words you chose to read.
Look at who they responded to. The statement is not a strawman when the person being responded to doesn't believe in respecting the rights of creators.
As if copyright never censors creators.
Fuck your sophistry.
If you don't want to discuss this, leave.
No. Fuck that. I don't consent to my art or face being used to train AI. This is not about intellectual property, I feel my privacy violated and my efforts shat on.
Unless you have been locked in a sensory deprivation tank for your whole life, and have independently developed the English language, you too have learned from other people's content.
Well, my knowledge can't be used as a tool of surveillance by governments and corporations, and I have my own feelings, intent, and everything in between. AI is artificial intelligence; AI is not an artificial person. AI doesn't have thoughts, feelings, or ideals. AI is a tool, an empty shell that is used to justify stealing data and surveillance.
This very comment is a resource that government and corporations can use for surveillance and training.
AI doesn't have thoughts? We don't even know what a thought is.
We may not know what comprises a thought, but I think we know it's not matrix math, which is basically all an LLM is.
Hard disagree: the neural connections in the brain can be modeled with matrix math as well. Sure, some people will be uncomfortable with that notion, especially if they believe in spiritual stuff outside physical reality. But if you're the type who thinks consciousness is an emergent phenomenon of our underlying biology, then matrix math is a reasonable approach to representing states of our mind and what we call thoughts.
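To illustrate that framing, here's a minimal sketch (a toy example of my own, not taken from any real model or neuroscience paper) of why both readings land on the same arithmetic: a single layer of an artificial network and a crude rate model of biological neurons are each just a matrix multiply followed by a nonlinearity.

```python
import numpy as np

# Toy example only: one "layer", whether you read it as part of an
# artificial network or as a crude rate model of biological neurons,
# is a weighted sum of inputs (a matrix multiply) plus a nonlinearity.
rng = np.random.default_rng(0)

x = rng.normal(size=4)        # input activations (embedding values / firing rates)
W = rng.normal(size=(3, 4))   # connection weights (learned parameters / synaptic strengths)
b = rng.normal(size=3)        # biases (firing thresholds)

def layer(x, W, b):
    return np.tanh(W @ x + b)  # squash the weighted sum into a bounded activation

print(layer(x, W, b))          # three output "neuron" activations
```

Whether stacking billions of these operations amounts to anything like a thought is exactly the open question being argued here; the sketch only shows that "it's just matrix math" describes both systems at this level of abstraction.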
Yet we live in a world where people will profit from the work and creativity of others without paying any of it back to the creator. Creating something is work, and we don't live in a post-scarcity communist utopia. The issue is the "little guy" always getting fucked over in a system that's pay-to-play.
Donating effort to the greater good of society is commendable, but people also deserve to be compensated for their work. Devaluing the labor of small creators is scummy.
I'm working on a tabletop setting inspired by the media I consumed. If I choose to sell it, I'll be damned if I'm going to pay royalties to the publishers of every piece of media that inspired me.
If you were a robot that never needed to eat or sleep and could generate 10,000 tabletop RPGs an hour with little to no creative input then I might be worried about whether or not those media creators were compensated.
The efficiency with which something can be created should have no bearing on whether someone gets paid royalties.
It absolutely should, especially when the "creator" is not a person. AI is not "inspired" by training data, and any comparisons to human artists being inspired by things they are exposed to are made out of ignorance of both the artistic process and how AI generates images and text.
It's impossible to make any comparison between how AI and how humans make decisions without understanding the nature of consciousness. Simply understanding how AI works isn't enough.
Are you seriously suggesting that human creativity works by learning to reduce the amount of random noise they output by mapping words to patterns?
Is that what they suggested? Or are you just wanting to be mad about something?
Fair. My point was: why is it important that we understand how consciousness works in order to see that the way consciousness creates art is not very comparable to a machine recognizing patterns?
The commenter above compared inspiration to the way AI companies are using the labor of millions of artists for free. In this context, I assume that's what they were hinting at when responding to "AI is not being inspired" with "we don't know how consciousness works."
Because one is a black box that may very well just be a much more advanced version of what current AI does. We don't know yet. It's possible that, with training on the trillions upon trillions of moments of experience a person has, an AI may be comparable.
I mean, the likelihood is basically zero, but it's impossible to prove the negative. At the end of the day, our brains are just messy Turing machines with a lot of built-in shortcuts. The only things that set us apart are how much more complicated they are and how much more training data we provide them. Unless we can crack consciousness, it's very possible that some day in the future we will build an incredibly rudimentary AGI without even realizing that it works the same way we do, scaled down. But without truly knowing how our own brain fully works, how can we begin to claim it does or doesn't work like something else?
None of which means it is impossible to determine whether or not an algorithm that couldn't exist without the work of countless artists should have the same IP rights as a human being making art (the answer is no).
When, not if, such a robot exists, do you imagine we'll pay everyone who'd ever published a roleplaying game? Like a basic income exclusively for people who did art before The Training?
Or should the people who want things that couldn't exist without magic content engines be denied, to protect some prior business model? Bear in mind this is the literal Luddite position. Weavers smashed looms. Centuries later, how much fabric in your life do you take for granted, thanks to that machinery?
'We have to stop this labor-saving technology because of capitalism' is a boneheaded way to deal with any conflict between labor-saving technology... and capitalism.
I know this is shocking to people who have never picked up a brush or written anything other than internet arguments after the state stopped mandating they do so when they graduated high school, but you can just create shit without AI. Literal children can do it.
And there is no such thing as something that “couldn’t exist” without the content stealing algorithm. There is nothing it can create that humans can’t, but humans can create things it can’t.
There’s also something hilarious about the idea that the real drag on the artistic process was the creating art part. God almighty, I’d rather be a Luddite than a Philistine.
Fuck off.
If that sounds needlessly blunt, no, it's far kinder than your vile opening insult, which you can't even keep straight - immediately noting that everyone, even children, has done an art at some point.
So maybe this discussion of wild new technology versus 17th-century law and the grindstone of capitalism isn't about individual moral failings of people who disagree with you.
Ooh, hit close to home to point out AI guzzlers are talentless hacks? Take a nap bro, you seem to need it.
Those 10,000 tabletop RPGs will almost certainly be completely worthless on their own, though they might contain some novel ideas, at least until a human comes by, scours them for ideas, and combines them. It could very well be that, in the same amount of time, that process only produces one coherent tabletop RPG idea.
It should be mentioned, though, that AIs don't run for free either; they cost quite a lot of electricity and require expensive hardware.
what a shitty argument. AI isn't people
Then don't post your art or face publicly. I agree with you if it's obtained through malicious means, but if you post it publicly, then expect it to be used publicly.
If you post your art publicly why should it be legal for Amazon to take it and sell it? You are deluding yourself if you believe AI having a get out of jail free card on IP infringement won't be just one more source of exploitation for corporations.
Taking it and selling it is obviously not legal, but taking it and using it for training data is a whole different thing.
Once a model has been trained, the original data isn't used for shit; the output the model generates isn't your artwork anymore, and it isn't really anybody's.
Sure, with some careful prompts you can get the model to output something fairly close to your style, but the output image isn't yours. It's whatever the model conjured up based on the prompt, in your style. The resulting image is still the model's own original.
It's akin to someone downloading your art, tracing it over and over again until they learn the style, and then going off to draw non-traced original art just in your style.
"You don't understand, it's not infringement because we put it in a blender first" is why AI "art" keeps taking Ls in court.
Care to point to those Ls in court? Because as far as I'm aware they are mostly still ongoing and no legal consensus has been established.
It's also a bit disingenuous to boil his argument down to what you did, as that's obviously not what he said. You would most likely agree that a human would produce transformative works even when basing most of their knowledge on that of another (such as tracing).
Ideas are not copyrightable, and neither are styles. You could use that understanding of another's work to create forgeries of their work, which is why protections against forgery exist. But just making something in the style of another does not constitute infringement.
EDIT:
This was pretty much the only update on a currently running lawsuit I could find: https://www.reuters.com/legal/litigation/us-judge-finds-flaws-artists-lawsuit-against-ai-companies-2023-07-19/
Hard to call that an L, so I'm eagerly awaiting them.
Biggest L: that shit ain’t copyrightable. A doodle I fart out on a sticky pad has more legal protection.
That's true, but the reason behind it is reasonable. There has been no human intervention, and so, like photos taken by animals, they hold no copyright. But that's not what you should be doing anyway; a tool must be used as part of a process, not as *the process*.
A court saying "nobody owns this" is closer to the blender argument than the ripoff argument.
If the large corporations can use IP to crush artists, artists might as well try to milk every cent they can from their labor. I dislike IP laws as well, and you can never use the masters' tools to dismantle their house, but you can sure as shit do damage and get money for yourself.
Luckily, AI models aren't the master's tools; they're a public technology. That's why they're already trying their hand at regulatory capture, just like they're trying to destroy encryption. Support open source development; it's our only chance. Their AI will never work for us. John Carmack put it best.
AI algorithms aren't the masters' tools in the sense that anyone can set up a model using free code on cheap tech (see the sketch below), but they do require money to improve and to produce quality products. The training data requires the labor of artists to produce, the computers the model runs on require labor to make, and the electricity that allows the model to continuously improve requires, you guessed it, labor. Labor will always, and should always, be expensive, making AI most useful to those with money to spend.
IP is only one tool in the masters' toolbox, but capitalism is their box, and the other forms of capital can be used to swing the public's tools far harder than a wage laborer can.
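To ground the "free code on cheap tech" point, here's a minimal sketch of what running a publicly released text-to-image model locally can look like. It assumes the open-source Hugging Face diffusers library and uses runwayml/stable-diffusion-v1-5 purely as an example of a public checkpoint; any comparable open model would do, and this is an illustration, not a recommendation of any particular one.

```python
# Minimal sketch, assuming `pip install diffusers transformers torch`
# and a consumer GPU; the checkpoint name is just one example of a
# publicly released model, not a recommendation.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any open checkpoint works here
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # half precision fits in roughly 6-8 GB of VRAM

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```

The inference side really is commodity code; the cost, and therefore the leverage, sits in the training data, the hardware, and the electricity, as the comment above points out.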
Hence why it's important that open source models keep the support of the public. If they have to close up shop, what remains will be the models made by tech giants. They will be censored, they will be neutered, unless you pay enough for them. The power should remain with the artists, not with those who have the money.
"They have the plant, but we have the power."
Based take.