"Famous" AI Artist Says He's Losing Millions of Dollars From People Stealing His Work

Arthur Besse@lemmy.ml to Not The Onion@lemmy.world
Famous AI Artist Says He's Losing Millions of Dollars From People Stealing His Work
gizmodo.com

cross-posted from: https://hexbear.net/post/3613920

https://archive.ph/tR7s6

Get fuuuuuuuuuuuuuucked

“This isn’t going to stop,” Allen told the New York Times. “Art is dead, dude. It’s over. A.I. won. Humans lost.”

"But I still want to get paid for it."

Stupid tardigrade doesn't even know how to play a violin

Good thing it's got a cello then.

One thing I know about violins is that they're smaller than cellos. Cellos are what, 4 feet long? That tardigrade is like 1mm big or something, much smaller than a cello. Therefore, it's holding a violin. Or maybe a bowed mountain dulcimer. /kidding

1mm? Dude, the scale is in the image, that's 150μm, one tenth that size. That viola is only 50μm long.

Stupid human doesn't even know what a violin looks like.

NGL, I am pretty tired and have my glasses off, thought he was holding a sword and shield and thought this was pretty cool.

per Wikipedia

On September 21, 2022, Allen submitted an application to the US Copyright Office for registration of the image. Prior to the first formal refusal, the Copyright Office examiner requested that the application exclude any features of the image generated by Midjourney. Allen declined and requested copyright for the whole image.

So what I'm getting from that is his Photoshop edits aren't significant enough to constitute a copyrightable work on their own and the copyright office was right to deem it a non-human production.

I'm just happy someone at the copyright office knows what they're doing

This has been the copyright office's stance for quite a while now. Actually, most of the world's respective IP registrars and authorities do not grant IP rights to AI generated material.

I'm glad about this, honestly.

If you want to use an AI model trained on vast sums of publicly posted work, go for it, but be ready for the result to be made into a truly public work that you don't own at the end of it all.

I agree. I think the effective entry of AI-generated material into the public domain, in combination with a lot of reporting/marking laws coming online, is an effective incentive for large corporate actors who don't like releasing stuff from their own control to keep a lot of material human-made.

What I'd like to see in addition to this is a requirement that content-producing models all be open source as well. Note, I don't think we need weird new IP rights that are effectively a "right to learn from" or the like.

I'm 100% in favor of requiring models to be open source. That's been my belief for a while now, because clearly, if someone wants to make an AI model off the backs of other people's work, they shouldn't be allowed to restrict or charge access to those models for the very people whose work was used, let alone other people more broadly.

Another idiot who thinks "prompt engineering" is a real skill and not just another way those companies use idiots for free AI training.

You ask AI to draw a ninja turtle on a skateboard, and the "effort" you put into phrasing the request well enough for the AI to understand teaches the AI that the ten past attempts were looking for what the eleventh got.

And now it won't take ten tries to go that route

Any "skill" by the user has a very short expiration date because the next version won't need it thanks to all the time users spent developing those "skills".

But no one impressed with AI is smart enough to realize that. And since they're the ones training the AI....

Idiots in, idiots out

"Promp engineering" is as useful skill as Google fu used to be.

I completely agree. I wonder whether some IT bachelor's degrees now have lessons in AI prompting. I remember in 2005 there was a course we had to do which could've been labeled "[shitty] Google-Fu" or something. "Information searching" is what it would more or less translate to. Basically searching well using Google and library searches. And I don't mean "library" in the IT context, but actual libraries. With books. We just had to use the search tools the local libraries had.

Such a fucking filler class.

In my year like 60 started, two classes. After three years like 8 graduated.

It's kinda dead now due to enshittification but the vast majority of humans I've interacted with could use a class on how to use a search engine.

Edit - it could be made more modern by showing how to ignore sponsored stuff, blatant SEO shit, AI shit, etc.

If the class had actually had any useful information in it, sure.

It was not the greatest class.

I've worked with tons of people who do not understand how to effectively use search engines. Maybe this was done poorly but it seems reasonable enough to me in principle.

I don’t know about that, in particular, because people generally add more detail, but it teaches the AI what kind of detail to add. So if you’re not picky, then yeah, the AI learns from that kind of thing.

As far as it being a useful skill, I don’t think it was in the first place. “Prompt engineer” has always been a joke. It’s like being a “sandwich artist”. Everyone can do it with one day of practice.

You don't have a clue how ai works do you?

Can you point out what's supposedly wrong with their comment or are you just claiming that every critic of so-called "AI" doesn't have a clue to justify the hype?

I use ai when I use search engines. This makes the search engines better. I also use ai when I get spotify suggestions. I use ai when I use autocorrect. I use ai without even realizing I'm using ai and the ai improves from it, and I and many other people get an improved quality of life from it, that's why nearly everyone uses it just like I do.

So, @givesomefucks , do you also regularly use ai that improves from your usage? Or are you not a hypocrite who thinks there is something morally bad about specific ais that you don't like while doing exactly what you claim to be against with other ais? How are your moral lines drawn?

Thanks for the example!

Whether an individual deems AI "smart" depends on how smart the person is. We're all our own frame of reference.

I have no doubt AI impresses you every day of your life, even stuff that's not AI apparently, because not all of your examples were.

You are just ignorant of the history and evolution of the term "AI". It's easy for anyone to learn about its history; your point of view is just one of ignorance of the past.

Thanks for demonstrating what a useless term "AI" is when you're not trying to sell snake oil.

Every word in every language changes over time. The term AI changing is the absolute normal. It's not some mark against it.

Current llms are phenomenally beneficial for some things. Millions of developers have had their entire careers completely changed. Teachers are able to grade work in 10% of the time. Children through to college students and anyone interested in learning have infinitely patient tutors on demand 24 hours a day. The fact that you are completely clueless about what is going on doesn't by any stretch of the imagination mean it isn't happening. It just means that you not only feel like you are "beyond learning", it also means that you don't even have people in your life that are still interested in personal growth, or you are too shallow to have conversations with anyone who is.

This is just beginning. The more you cling to being in denial of progress, the further you will get behind. You are denying any mode of transportation other than horses even exists, while people are routinely flying around the world. It most likely won't be too long until your mindset is widely accepted as a mental disorder.

Every word in every language changes over time. The term AI changing is the absolute normal. It's not some mark against it.

Lumping machine learning algorithms, llms, regressive learning, search algorithms all in one bucket and calling it "AI" serves no proper purpose. There is no consensus, it's not a clear definition, it's not convenient and it only helps sell bullshit. Llms aren't intelligent. Calling them that is the opposite of useful.

Current llms are phenomenally beneficial for some things.

Namely: the portfolio of tech shareholders and grifters.

Millions of developers have had their entire careers completely changed.

Lol, no. What's your source for this?

Teachers are able to grade work in 10% of the time.

Poor students.

Children through to college students and anyone interested in learning have infinitely patient tutors on demand 24 hours a day.

Have you heard of the stories where students believed some AI bullshit more than what their teacher told them? Great "tutor" you have there.

The fact that you are completely clueless about what is going on

Sure, bud. /s

It just means that you not only feel like you are "beyond learning", it also means that you don't even have people in your life that are still interested in personal growth, or you are too shallow to have conversations with anyone who is.

Oh, please tell me more about my life, stranger on the internet! /s

What an asshole, seriously.

Have fun in your tech cult, you ableist bootlicker.

Yesterday's AI is today's normal technology, this is just what keeps happening. Some people just keep forgetting how rapidly things are changing.

You'll join this "cult" once the masses do, just like you have been doing all along. Some of us are just out here a little bit in the future. You will be one of us when you think it becomes cool, and then you will self-righteously act like you were one of us all along. That's just what weak-minded followers do. They try to seem like they knew all along where the world was headed without ever trying to look ahead and ridiculing anyone who does.

The thing you're evangelizing only leads to more consolidation of power and money, loss of jobs and power for the working class and climate devastation.

Yeah, technological progress has historically made life worse for humans.

Technological "progress" historically mostly served to siphon power to the wealthy.

Also, as a species, we're currently in the process of conducting a mass extinction, as well as destroying our biosphere.

I recommend you read the book "Blood in the Machine" as an account of how industrialization worsened the lives of 19th century textile workers and how the Luddites fought against the disenfranchisement of the people.

I recommend reading "The Better Angels of Our Nature" by Steven Pinker. People love to complain about how much worse quality of life has gotten for people, but when actually pressed for specific ways it has gotten worse, they are almost always arguing from a complete ignorance of history. Lifespan is much longer, healthspan is much longer, rapes are way down, murders are way down, torture is way down, dying in childbirth is way down, incest is way down, pedophilia is way down, starvation is way down, dying from wild animals is way down, wars are way down. Problems now for lots of the world are things like people bickering over who gets to be next to who when they pee and who called a "he" a "she".

This idea that the world is way worse than it used to be is absurd and just shines a massive light on how many selfish brats are completely oblivious of where we came from.

I know of Steven Pinker and don't agree with his conclusions. I even have a video debunking him, which you're probably not going to watch either.

but when actually pressed for specific ways it has gotten worse, they are almost always arguing from a complete ignorance of history.

You've ignored my point about climate change. I'm quite sure that concerning history, I got a bit more nuanced takes than you do.

Lifespan is much longer, healthspan is much longer

Medical advances have been made, yes. In part due to technological advancements. These are fundamentally good and not what I'm arguing against.

rapes are way down

How is that a result of technological advances and not social movements fighting for a brighter future?

murders are way down, torture is way down,

I doubt that, if you look at it globally. Israel is currently performing a genocide. I consider that murder.

dying in childbirth is way down

Again, healthcare advances.

incest is way down, pedophilia is way down, starvation is way down

Not touching those points.

dying from wild animals is way down

Biodiversity is way down and global pandemics are way up.

wars are way down

Are you kidding me? Did you live under a rock for the last 50 years?

Problems now for lots of the world are things like people bickering over who gets to be next to who when they pee and who called a "he" a "she".

No, problems now for "lots of the world" is climate change, job insecurity, housing crisis, dangers of a global war escalation, ...

This idea that the world is way worse than it used to be

Not what I said.

selfish brats completely oblivious of where we came from

Says the person who chooses to ignore how much progress has been made through sometimes violent social movements.

It is honestly amazing how little you know of history. We literally had over 100 weeks with more casualties than the entire Russian and Israeli wars combined back in WW2. And somehow you think wars are worse now. It's honestly mind-boggling.

Also, it is absolutely undeniable that murder is way down. You haven't looked, and you can't imagine how far off you are. The same for pandemics, it's not even comparable. You are arguing that dinosaurs are smaller than most mice and it would be hilarious if it wasn't so sad that you are so sure of yourself and so completely wrong about something you refuse to research. It is literally so easy to quickly look up and find you are wrong.

There is a reason why you point to examples from years ago, that's because that is where you are still stuck.

Students "correcting" their teachers on AI bullshit isn't "from years ago".

The old examples of AI I mentioned used to be the bleeding edge of AI research. Now they're old hat. The same thing will happen to LLMs. And LLMs won't lead to so-called "AGI", just like the other examples didn't.

I like the comment that said the AI is the artist and he's just a commissioner, makes perfect sense.

Drag thinks profits from AI art should automatically go to funding an AI Advocacy Commission established by the government to explore questions of AI consciousness and AI rights. The AAC should be devoting resources to solving the hard problem of consciousness and improving working conditions for AIs, in whatever way experts believe is most beneficial to AI welfare.

This is how you stop The Matrix from happening, people!

Poe's law in full swing in this comment.

Drag is being entirely serious. Drag believes AI is a vegan issue until the hard problem of consciousness is solved in a way that conclusively proves AIs are not capable of experience. We have as much trouble telling if animals like fish are capable of feeling pain as we do with AIs. Drag does not eat fish, and drag does not believe it is right to use AI until we have an answer. Drag thinks the answer might be that using AI is fine, but drag is not a gambler and drag would certainly not gamble with another being's life.

Then "drag" (whoever that is) anthropomorphises a statistical model, which is stupid.

Drag does not anthropomorphise anything! Drag resents that accusation. Drag has spoken with many otherkin who are entirely inhuman and still deserving of love and respect. Drag treats AI like those. Not like a human.

It's still anthropomorphisation.

Cool for drag. Mind if other people don't give a crap about what drag thinks?

Drag thinks that if your opinion is that treating things like otherkin is anthropomorphisation, then you must be anthropomorphising otherkin.

It's not any kind of "kin". It's a statistical model. It's about as sentient as a Gaussian blur is.

You are a statistical model, and drag does not go around telling people you are not sentient.

You are a statistical model

No I am not. Different ontological entities, donkey.

Drag chooses to respect your identity as a non-statistical model. Drag's opinions are not as important as the way you feel, and drag will make an effort to avoid upsetting you by referring to you in the wrong way.

Does drag have proof that the other user is a statistical model, or is drag guilty of dehumanizing others to fit drag's agenda just like a Nazi would?

A human is a kind of statistical model, so drag's comment does not imply any specific amount of humanity, either small or large.

It's not "famous" that should be in inverted commas, but "artist".

We call those quotation marks.

But yes.

Who is we? The global pedant society?

The English language? I have never heard the phrase "inverted commas."

But as to your point: "Both? Both is good."

Ok so I apologise for my earlier snarky reaction but I felt zahille7's response was somewhat condescending. Particularly since it is terminology recognised by three major English dictionaries, one of which is widely regarded as the leading authority on the English language...

https://www.oed.com/dictionary/inverted-comma_n?tl=true
https://www.collinsdictionary.com/dictionary/english/inverted-commas
https://dictionary.cambridge.org/dictionary/english/inverted-commas

... So just because you have never heard of something, doesn't give you licence to be rude to someone or talk down to them as if they are stupid for their choice of phrasing. Or maybe it just means you aren't British...

Nailed it on the last one. I was going to say, you can probably thank the American education system if it's common enough to be recognized by dictionaries like those. And Zahille7 is probably American, too, which caused the snarky comment in the first place.

Just the usual case of English being a crazy language that ruffles through other languages' coat pockets looking for loose adverbs.

How to use inverted commas

From the national broadcaster of England

Ah, the usual case of English and American being two entirely different languages despite pretending otherwise.

The national broadcaster of Britain. Otherwise it would be called the EBC, not the BBC.

I know, I deliberately said England here to emphasize they would be a good authority on the English language

Actually I'd argue you could put quotation marks on every word in the first half of the headline.

Agreed.

Get fucked, you no talent ass clown.

He's really good at writing words about his stolen-content-based generated image, you've got to give him that.

But no, fuck copyrighting AI content, that's a dead channel from a copyright perspective.

I said this when it was posted elsewhere- how is he calculating those millions of dollars?

RIAA and MPAA anti-piracy strategy, aka, made up bullshit.

It's easy. He chose an arbitrary number that was very big but just shy of 99% unrealistic, according to his own flawed judgement.

Probably the same way companies tried to sue The Pirate Bay because the bay had lost them more money than existed in the world economy.

Art is dead, dude. It’s over. A.I. won. Humans lost.

He made the art shown below. It's not even good lmao, why the fuck would you declare something like that if you make the shittiest looking AI art. What a fucking clown.

It looks cool from a distance, but it really falls apart when you look closer

He didn't make shit.

A computer made it. He provided some guidance.

Of course he didn't, but it just makes it worse that the image that's "ending human art" is distorted and glitchy.

Well, in a way all art is made indirectly by some sort of instrument. Only the degree of sophistication, or degree of separation, of this instrument is different. A pencil drawing is in principle also done by the pencil, but I provided a lot of guidance through my hand. A pencil - almost no sophistication - is on one side of the spectrum and Midjourney/Stable Diffusion etc. is on the other side of the spectrum.

I don't want to judge AI "art" in general - there's so many awful traditional artworks that AI art doesn't really stand out.

What rubs me the wrong way is that it is a tool that no human can understand reasonably well. Everybody can understand a pencil. It's possible to understand a computer renderer that renders digital art. But no one can understand the totality of a generative model which was trained on terabytes of images. It's a lot of trial and error, because what the tool does is generate random images, even with precise directions. It's throwing dice until one likes the result.

The one thing I give this "artist" credit for: he was very early (maybe even the first?) to enter AI art into a contest and fool the jury. Being the first is often enough, historically, to make "great art", where art is measured more by the impact it has on a societal discussion. So I give him that.

But a court already decided you can't copyright AI art, because it's trained on other art without permission. So he can get fucked.

The pencil does not make the art.

There's a fundamental difference between AI image generation and an artist creating something that is both inherent and obvious.

If you can't see that then I'm not sure there's much help for you.

More than that, art being created by an artist has a style and a feeling behind it. There's a nostalgia present in every painting. An artist saw something, and recreated it in a way that spoke to them.

An algorithm can recreate images that look similar but with no understanding. It's just an image and lacks all the things that makes art what it is. By removing humanity from art you literally remove the reason for it to exist.

Flatly, it isn't art. It's slightly better than random. But as it happens, humans are better at that too.

It annoys me that whatever the big yellow circle is isn't centered in the image.

That doesn't bother me as much as when you actually zoom in on the people

Normally when you paint somebody, you do so in a recognizable pose, standing or caught in a stance that implies their motion.

Here you have someone presumably looking at the orb, but they look more like a weeble wobble. Is that their tiny little arm holding their ear? They're not balanced, and I'm not even sure the head is connected to the neck; there should be meat back there, right? The raw proportions are just wrong.

The overall feeling the piece conveys is pretty impressive but the actual details are bullshit.

Looks like a psychedelic dream, and not in a good way.

I'm hoping someone with actual talent paints the AI picture and then copyrights it

I don't think someone painting a physical copy of the image will gain ownership of the copyright.

"""Artist"""

i think “content creator” would be a better term in this case. because i’m not convinced it’s art, but it sure is “content”. maybe “content requester” would be more accurate.

I'm pretty sure there's a misspelling. It's spelled "douchebag" not "artist".

ChatGPT, show me the world's tiniest violin playing "No One Gives a Fuck" in A minor.

The layers of irony in this case are almost too delicious.

It's probably a safe bet that this AI artist was also an NFT artist or procurer a few years ago.

I’m not convinced he is not playing 4D chess and recognizes the huuuge irony.

Then again, satire is dead.

In 2021 I made a sound installation project called "Opéra Spatial" and entered a bunch of public prompts into Midjourney via Discord to generate images for the work. This guy made his image one year later.

One of the very first things I did when I learned about Midjourney was look into the copyright matters pertaining to AI generated art. I saw that it's not really copyrightable, and then started using the search feature on their Discord to find prompts by others for the junk I wanted.

He cannot copyright it because he didn't make it. He wrote a couple of words into a text box. It's no different from commissioning an artist to draw for you, except in this scenario it is analogous to the artist turning out to be someone who traces other people's art without their consent, and claiming you made the picture.

Bit melodramatic. Even the real artists that midjourney actually stole from don't claim to have lost millions individually as a result.

He needs to take a photograph of it and then copyright the photograph. Easy!

I don't even care about the "AI is content theft" arguments. No self-respecting artist would ever accuse him of plagiarism over this. It looks like garbage. The copyright office rejected his copyright claims on the grounds that he didn't make it. Same story as the monkey selfie guy: You didn't make the art, it isn't your art. If a human didn't make the art, it can't be copyrighted.

He claims it was a mix of Midjourney and Photoshop, but honestly, I've made prettier things just fucking around with SDXL on my gaming PC, and I can confirm that it took absolutely no talent or effort to do it. The hardest part of the process was setting up AUTOMATIC1111, and that's not even very hard.

And I would never even dream of taking credit for anything I've generated, because I didn't make it. I just typed a bunch of wildcard arguments into a prompt and let my GPU dump out thousands of 4K wallpapers for entertainment. This guy thinks this one artifact-ridden generation has any actual value? It's "famous" for pissing people off by competing against humans and unjustifiably winning. Being controversial could be valuable, if the controversy didn't fundamentally render the "art" valueless by revealing that it is nothing more than a GPU vomiting up inference. The real villain in this story is the art contest organizers that stuck a blue ribbon on this slop.

This article is annoyingly one-sided. The tool performs an act of synthesis just like an art student looking at a bunch of art might. Sure, like an art student, it could copy someone's style or even an exact image if asked (though those asking may be better served by torrent sites). But that's not how most people use these tools. People create novel things with these tools and should be protected under the law.

So what you're saying is that the AI is the artist, not the prompter. The AI is performing the labor of creating the work, at the request of the prompter, like the hypothetical art student you mentioned did, and the prompter is not the creator any more than I would be if I kindly asked an art student to paint me a picture.

In which case, the AI is the thing that gets the authorial credit, not the prompter. And since AI is not a person, anything it authors cannot be subjected to copyright, just like when that monkey took a selfie.

It should be as copyrightable as the prompt. If the prompt is something super generic, then there's no real work done by the human. If the prompt is as long and unique as other copyrightable writing (which includes short works like poems) then why shouldn't it be copyrightable?

Because it wasn't created by a human being.

If I ask an artist to create a work, the artist owns authorship of that work, no matter how long I spent discussing the particulars of the work with them. Hours? Days? Months? Doesn't matter. They may choose to share or reassign some or all of the rights that go with that, but initial authorship resides with them. Why should that change if that discussion is happening not with an artist, but with an AI?

The only change is that, not being a human being, an AI cannot hold copyright. Which means a work created by an AI is not copyrightable. The prompter owns the prompt, not the final result.

You're assigning agency to the program, which seems wrong to me. I think of AI like an advanced Photoshop filter, not like a rudimentary person. It's an artistic tool that artists can use to create art. It does not in and of itself create art any more than Photoshop creates graphics or a synthesizer creates music.

How do the actions of the prompter differ from the actions of someone who commissions an artist to create a work of art?

I don't think commissioning a work is ever as hands-on as using a program to create a work.

I suspect the hangup here is that people assume that using these tools requires no creative effort. And to be fair, that can be true. I could go into Dall-E, spend three seconds typing "fantasy temple with sun rays", and get something that might look passable for, like, a powerpoint presentation. In that case, I would not claim to have done any artistic work. Similarly, when I was a kid I used to scribble in paint programs, and they were already advanced enough that the result of a couple minutes of paint-bucketing with gradients might look similar to something that would have required serious work and artistic vision 20 years prior.

In both cases, these worst-case examples should not be taken as an indictment of the medium or the tools. In both cases, the tools are as good as the artist.

If I spend many hours experimenting with prompts, systematically manipulating it to create something that matches my vision, then the artistic work is in the imagination. MOST artistic work is in the imagination. That is the difference between an artist and craftsman. It's also why photography is art, and not just "telling the camera to capture light". AI is changing the craft, but it is not changing the art.

Similarly, if I write music in a MIDI app (or whatever the modern equivalent is; my knowledge of music production is frozen in the 90s), the computer will play it. I never touch an instrument, I never create any sound. The art is not the sound; it is the composition.

I think the real problem is economic, and has very little to do with art. Artists need to get paid, and we have a system that kinda-sorta allows that to happen (sometimes) within the confines of a system that absolutely does not value artists or art, and never has. That's a real problem, but it is only tangentially related to art.

should a camera also own the copyright to the pictures it takes? (I seriously hate photographers)

Ah, but there is a fundamental difference there. A photographer takes a picture, they do not tell the camera to take a picture for them.

It is the difference between speech and action.

If the prompt is as long and unique as other copyrightable writing (which includes short works like poems) then why shouldn't it be copyrightable?

Okay, so the prompt can be that. But we're talking about the output, no? My hello-world source code is copyrighted, but the output "hello world" on your machine isn't really, no?

Does it require any creative thought for the user to get it to write "hello world"? No. Literally everyone launching the app gets that output, so obviously they didn't create it.

A better example would be a text editor. I can write a poem in Notepad, but nobody would claim that "Notepad wrote the poem".

It's wild to me how much people anthropomorphize AI while simultaneously trying to delegitimize it.

The tool performs an act of synthesis just like an art student looking at a bunch of art might.

Lol, no. A student still incorporates their own personality in their work. Art by humans always communicates something. LLMs can't communicate.

People create novel things with these tools and should be protected under the law.

I thought it's "the tool" the "performs an act of synthesis". Do people create things, or the LLM?

No no, he created the prompt. That's the artistic value /s

The machine learning model creates the picture, and it does have a "style"; the "style" has been at least partially removed from most commercial models but still exists.

It doesn't have a "style". It stores a statistical correlation of art styles.

Different models will have been trained on different ratios of art styles: one may have been trained on a large number of oil paintings and another on pencil sketches, and these models would produce different outputs for the same inputs.

You're not stating anything different than my "correlation" statement.

It’s deterministic. I can exactly duplicate your “art” by typing in the same sentence. You’re not creative, you’re just playing with toys.

Try it out and show us the result.

Ok, here's an image I generated with a random seed:

Here's the UI showing it as a result:

Then I reused the exact same input parameters. Here you can see it in the middle of generating the image:

Then it finished, and you can see it generated the exact same image:

Here's the second image, so you can see for yourself compared to the first:

You can download Flux Dev, the model I used for this image, and input the exact same parameters yourself, and you’ll get the same image.

But you're using the same seed. Isn't the default behaviour to use a random seed?

And obviously, you're using the same model for each of these, while these people would probably have a custom trained model that they use which you have no access to.

That's not really proof that you can replicate their art by typing the same sentence like you claimed.

If you didn’t understand that I clearly meant with the same model and seed from the context of talking about it being deterministic, that’s a you problem.

Bro, it's you who said to type the same sentence. Why say the wrong thing and then try to change your claims later?

The problem is that you couldn't be bothered to try and say the correct thing, and then have the gall to blame other people for your own mistake.

And in what kind of context does using the same seed even make sense? Do people determine the seed first before creating their prompt? This is a genuine question, btw. I've always thought that people generally use a random seed when generating an image until they find one they like, then use that seed to modify the prompt to fine-tune it.

In the context that I’m explaining that the thing is deterministic. Do you disagree? Because that was my point. Diffusion models are deterministic.

That's as much deterministic as tracing someone's artwork, really.

If you have to use a different creation process than how someone would normally create the artwork, whether legitimate or using AI, then it's not really a criticism of that method in the first place.

I was seriously thinking you found a way to get similar enough results to another person's AI output just from knowing the prompt. That would actually prove that AI artwork requires zero effort to reproduce.

Edit: To expand on that first paragraph, yes, AI is deterministic as much as a drawing tablet and app is deterministic; that is, if you copy exactly what another person does using the tool, it will produce the same result.

You might be able to copy one stroke of a pen exactly, but the thousands or tens of thousands of strokes it takes to paint a painting? Like, yeah, you can copy a painting “close enough”, but it’s not exactly the same, because paint isn’t deterministic.

As far as making a “close enough” copy that isn’t exactly the same with AI, you can just use any image as the input image and set the denoising strength to like .1. Then you’ll get basically the same image but it’ll have a different checksum. So if you wanna steal art, AI makes it way easier.
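For what it's worth, here's a minimal sketch of that low-denoise pass using the diffusers library; the checkpoint and file names are placeholders, not anything from this thread:

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Placeholder checkpoint and file names; any img2img-capable SD model works.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

source = Image.open("original_artwork.png").convert("RGB")  # hypothetical input image

# strength=0.1 means only the tail end of the denoising schedule runs,
# so the output stays visually almost identical to the source image.
result = pipe(
    prompt="a painting",   # nearly irrelevant at this low strength
    image=source,
    strength=0.1,
).images[0]

result.save("near_copy.png")  # looks the same, but the file bytes/checksum differ
```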

There’s not really any human creativity in this process, or even using your own prompts, which is the whole point behind the copyright office denying this guy’s copyright claim. Maybe you could copyright your prompt, if it’s detailed enough.

That's actually fundamentally untrue, independent of your opinion. I promise that when people generate an image with a phrase it will be different and is not deterministic (not in the way you mean).

You and I cannot type the same prompt into the same AI generative model and receive the same result, no system works with that level of specificity, by design.

They pretty much all use some form of entropy / noise.

This can actually be true, depending on how the system is configured.

For instance, if you and someone else use the same locally-hosted Stable Diffusion UI, both put the exact same prompt, and are using the same seed, # of steps, and dimensions, you'll get an identical result.

The only reason outputs differ between runs is the noise from the seed, which is normally set randomly for each generation; it can easily be set to the same value as someone else's generation and will yield an identical result unless the prompt is changed.
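A minimal sketch of that setup with the diffusers library, assuming a locally downloadable Stable Diffusion checkpoint (the model name and prompt here are just illustrative): with the seed, step count, and dimensions pinned, repeated runs give bit-identical images.

```python
import torch
from diffusers import StableDiffusionPipeline

# Illustrative checkpoint; any Stable Diffusion model you can run locally works.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def generate(seed: int):
    # Fixing the seed fixes the initial latent noise, the only source of
    # randomness in the sampling loop.
    generator = torch.Generator("cuda").manual_seed(seed)
    return pipe(
        "fantasy temple with sun rays",
        num_inference_steps=30,
        height=512,
        width=512,
        generator=generator,
    ).images[0]

a = generate(1234)
b = generate(1234)
print(a.tobytes() == b.tobytes())  # True on the same hardware and library versions
```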

It’s literally as true as it can possibly be. Given the same inputs (including the same seed), a diffusion model will produce exactly the same output every time. It’s deterministic in the most fundamental meaning of the word. That’s why when you share an image on CivitAI people like it when you share your input parameters, so they can duplicate the image. I have recreated the exact same images using models from there.

Humans are not deterministic (at least as far as we know). If I give two people exactly the same prompt, and exactly the same “training data” (show them the same references, I guess), they will never produce the same output. Even if I give the same person the same prompt, they won’t be able to reproduce the same image again.

I do actually believe that everything, including human behavior is deterministic. I also believe there is nothing special about human consciousness or creation tbh