Rule elitism

ZILtoid1991@lemmy.world to 196@lemmy.blahaj.zone – 368 points –

The only reason you care is because you've been conditioned to attack anything that could harm your income-potential.

Instead of fighting "AI", how about we fight for a world where artists don't have to monetarily justify their existence?

Or maybe artists should be able to not justify their existence monetarily and also not have their art fucking stolen and murdered to generate terrible pseudo art lmao.

This. Art is expression. Wtf is AI art expressing?

Whatever the artist using the AI tool is trying to express?

An AI doesn't understand what human emotion is.

Does a paintbrush?

This is a false equivalency. Unless the paintbrush is stolen I guess. 🙄

That's unprovable without some very strict definitions, but if we take it as a given (and for the record I don't disagree, so we should) then that's why the ai isn't the artist. It's just a tool an artist could use. MS Paint isn't an artist either, and like ai neither are many of the people using it, but it still can be used to create art.

Meh, a better approach is to assume it doesn't understand emotion unless proven otherwise. Does a fork understand what human emotion is? A pillow? You wouldn't assume that either, I guess.

So which of us are p-zombies? We run into the same problem by suggesting that human beings have consciousness or self-awareness, or experience qualia, except we can't prove that anyone has any of these things. The debate over AI consciousness within its development community is a sorites paradox. Large AI systems like GPT-4 have more apparent awareness than previous versions, but not as much awareness as humans. Yet GPT-4 does exceed human control subjects in the Turing test.

Mind you, the Turing test is only one of several tests we use to rate how advanced an AI is, but even when an AI can make coffee given a machine and supplies, and assemble flat-packed furniture from the IKEA visual instructions, we can't be sure that this counts as AGI, or that it is sentient.

Right now, there are artists who use generative AI to create art, and it is as much really art as photography was really art back when illustrators were complaining that photographers were just using a machine to replicate a real scene. As much as music production and music synthesis are art.

Now yes, I get that AI presents risks of workers losing income and their capacity to survive, but every time we toss our sabots into the gearworks to break the machines, we're kicking overthrow of the system down the line, until we're where we are today, not only looking at the dissolution of our democracy so that industrialists may continue to exist, but also the destruction of our habitat, because we can't address what makes them money.

So capitalism is going to end you either way, unless you end it first. And I expect if you actually tried to make a fortune on your art, you would eventually find yourself selling out all your rights to one of the big corporate controllers, and they would own everything you did, and pay you a pittance for it... Unless you are James Hetfield kind of skilled and lucky. Somehow I doubt you are.

What about a cat, or a person who's different from you? It's just as impossible to prove, and yet...

Welcome to radical constructivism :) The question of whether other people or cats can experience emotions is in fact a problem people have been thinking about quite a lot. The answers are not very satisfactory, but one way to think about it (some constructivists would do this, for example) is that assuming they do have a consciousness simplifies your world model. In the case of "AI", though, we have good alternative explanations for their behavior and don't need to assume they can experience anything.

The other important bit is that not assuming some phenomenon exists (e.g., "AI" can experience emotions) unless proven otherwise is the basis of modern (positivistic) science.

assuming they do have a consciousness simplifies your world model.

Does it? Feels more like it merely excludes them from your model, since your model cannot explain their consciousness. If that simplifies your model, then you can apply the same thinking to anything you don't understand by simply saying it is similar to something else you also can't explain.

The other important bit is that not assuming some phenomenon exists (e.g., "AI" can experience emotions) unless proven otherwise

The problem with this isn't that it's literally unprovable; it's that proving it requires defining "can experience emotions" in a way everyone can agree on. Most trivial definitions that include everything we think obviously ought to be included also bring in many things we think ought to be excluded, and many complicated definitions that prune out the things we think ought to be excluded often also cut out things we think should be included.

Does it?

Yes, in the sense that "thing moves around and does stuff" becomes more predictable if you assume a certain degree of consciousness. This is easier than "thing is at this position now, was at a different position before, was at yet another position before that". You reduce some of the complexity and unpredictability by introducing an explanation for these changes of world state. In my world, so far I worked well with the assumption that other humans and animals have some consciousness and at least I'm not aware of any striking evidence that would raise doubt on that.

The problem with this isn’t that it’s literally unprovable

Yes, that's a problem, but it's relatively similar to the other one. It's actually quite hard to "prove" anything with real world connection. However, in the case of other humans/animal consciousness, evidence suggests so (at least for me). The evidence in the case of "AI" is different, though. For example, they seem to have no awareness of time and no awareness of the world beyond the limited context of a conversation. Besides a fancy marketing term that suggests there is something similar to living beings involved, what we currently see are admittedly impressive programs that run on statistics, but I don't need to assume any "consciousness" to explain what they do.

You reduce some of the complexity and unpredictability by introducing an explanation for these changes of world state

My concern is that "consciousness" isn't so much an explanation as it is a sort of heuristic. We feel conscious and have an internal experience, so it seems pretty reasonable to say that such a thing exists, but other than one's own self, there's no point where it is certain to exist, and no clear criteria or mechanism that we can point to.

What about the p-zombie, the human person who just doesn't have an internal experience and just has a set of rules, but acts like every other human? What about a cat, who apparently has a less complex internal experience, but seems to act like we'd expect if it has something like that? What about a tick, or a louse? What about a water bear? A tree? A paramecium? A bacterium? A computer program?

There's a continuum one could construct that includes all those things and ranks them by how similar their behaviors are to ours, and calls the things close to us conscious and the things farther away not, but the line is ever going to be fuzzy. There's no categorical difference that separates one end of the spectrum from the other, it's just about picking where to put the line.

And yes, we have perhaps a better understanding of the mechanism behind how an AI gets from input to output than we do for a human, but it's not quite a complete one. And the mechanism for how humans get from an input to an output is similarly partially understood. We can see how the arrangement and function of nerve cells in a "brain" lead to the behaviors we see, and have even fully simulated the brains of some organisms in software. This is not so dissimilar from how a computational neural network is operated. The categorical difference of "well one is a computer" doesn't work when we have literally simulated an organic brain on a computer as well.


If I ask a painter to paint a landscape, who's making art, me or the painter?

Is the painter just a tool?

You can't really have it both ways.

Is the thing just a machine that's following instructions and synthesizing its training data into different things? Then it's a tool.

Is the thing making choices and interpreting your inputs to produce a result? Then it's an artist.

The painter I buy a commission from is an artist. The ai I use to generate a scene is a tool.

Was the "training data" produced by artists or tools?

I mean, yes?

That's very pithy, but the material used as training data was probably produced by artists attempting to create art using tools (ai and otherwise), as well as more mundane data designed and produced by humans with no ai tools and some produced by humans with almost exclusively ai tools.

You probably live in a different world than I do.

Don't chicken/egg this. All of the training data was man-made at some point. Until the first LLMs started outputting based on it.

Secondly, the amount of human-produced content and LLM-produced content that's in the training data is incomparable. And will continue to be so. Otherwise the models break.


Art is aesthetic. It can express something but it doesn't have to.

Not necessarily. Not all art is visual, not all art is made to create pleasing emotions. Not all of it is made to be beautiful.

It doesn't have to be a pleasant aesthetic or visual. It could be anything from a full image to a font used on a resume to the choice of words used in general to the way the email address sounds if you pronounce it out loud. It can be the sequence of smells, sounds, sights, taste, and feel of a single course or five course meal.

It can be puppets designed to last generations or an explosion that exists for a brief moment.

It can even be the cleverness of how a message is woven into an otherwise meaningless looking scene.


Terrible pseudo art is what you get from hollywood and big music studios right now, for the most part.

Nice of you to think AI won't just lead to even more formulaic stuff, but maybe from different megacorporations.

All those poor OCs, dead forever! No human can ever draw them again now that AI ate them!


The only reason you care is because you’ve been conditioned to attack anything that could harm your income-potential.

My ‘favorite’ is the argument that replacing jobs is what technology is meant to do.

This isn’t just a job. If I won the lotto tomorrow, if I had billions and billions of dollars and never had to make another cent in my life, I would still be writing. Art is not just a production, it is a form of communication, between artist and audience, even if you never see them.

Writing has always been something like tossing a message in a bottle into a sea of bottles and hoping someone reads it. Even if the argument that AI can never replace human writing in terms of quality is true, we're still drowned out by the noise of it.

It really revs up the ol’ doomer instinct in me.

Even if the argument that AI can never replace human writing in terms of quality is true, we're still drowned out by the noise of it.

It's a good point but honestly the internet is mostly just noise and it's not a problem we're going to solve. It's something we have to learn to live with. If you take more than a passing interest in an art, you should be able to find an island in the ocean of noise with like-minded people.

The issue is, the goal of AI companies and AI believers is the replacement of artists. They see more traditional forms of art as obsolete. Especially the AI believers, who do everything they can (including trying to generate fake art progresses, and entering art contests with AI generated images) to "disrupt the art community".

The only way this could work is communism, but I don't think tech bros will call for that any time soon.


You can't "defeat" AI. It's not an organization or a group to fight against; it's technological progress.

The weavers' uprisings didn't stop the Industrial Revolution either. AI is a tool, and those who learn to handle and adapt to it the fastest will be the ones who fare the best.

The weavers fought the capitalist system, not industrial progress, with the weaving machines being the easiest target as they were expensive to build but highly profitable.

Same here: The goal is the overcoming of capitalism but until then we can annoy them by messing with their new toys.

People said the same about NFTs, and they suddenly disappeared...

With enough "bullying", we can force the genie back into its lamp. We just need to learn more from anti-GMO karens.

It's not even a matter of bullying: NFTs disappeared because they were fundamentally not viable, and there's a good chance that generative AI is also not viable.

Generating an output is extremely computationally expensive, which is a problem because you need several attempts to get an acceptable output (at least in terms of images). This service can't stay free or cheap forever, and once it starts being expensive, that's also a problem in itself since generative AI is most suited to generate large amounts of low-profit content.

For example, earlier this month, DeviantArt highlighted a creator that they claimed to be one of their highest earners; they made $25k "in less than a year", which is not much for their highest earner, and they did it by posting over NINE THOUSAND images in that time. They were selling exclusives for less than $10.

The only way this makes sense is if it's really cheap to generate that many images. Even a moderate price, multiplied by 9000, multiplied by the number of attempts each time, would have destroyed their already middling profit.
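To make the comment's own figures concrete, here's a quick back-of-envelope check (the $25k and 9,000 figures come from the post above; the attempts-per-image number is an illustrative assumption, not something the post states):

```python
# Back-of-envelope check on the DeviantArt example: $25k revenue on 9,000+ images.
revenue = 25_000            # dollars, "in less than a year" (figure from the post)
images_sold = 9_000         # images posted in that time (figure from the post)
attempts_per_image = 5      # assumption: several generations per acceptable output

revenue_per_image = revenue / images_sold
break_even_cost = revenue_per_image / attempts_per_image

print(f"revenue per published image: ${revenue_per_image:.2f}")
print(f"break-even cost per generation attempt: ${break_even_cost:.2f}")
```

At roughly $2.78 of revenue per published image, each individual generation attempt has to cost a small fraction of a dollar for the numbers to work at all.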

NFTs were a straight-up scam. The wire mother of wire fraud.

The only reason crypto almost-sorta-kinda works is that nearly anything works as a medium of exchange so long as it's fungible. Which is the F those Ts were N.

After 1945, they made sure no one is rejected from art school anymore

The thing I hate most about this artist is that he can actually be funny sometimes, when he's not being a bigoted troll. Which is most of the time.

Like, how dare that asshole make me laugh at something he made?

Absolutely.

I didn't even notice, then in my panic I replied to my own comment instead of editing.

I'll leave the whole mess up now - mea culpa.

He can be really subtle in his messaging, too.

Like, reading this comic as a normal person, I see a ha-ha funny joke about the robot doing a hitler. Why does being rejected from art school make him do this? Uh... I don't know. It's just a reflection on an old story, don't think too much about it.

Viewing this comic through the lens of a nazi, however, doesn't it seem a little bit like a call to action? As if it's excusing the violence the nazi robot will engage in as a kind of justice for people's dismissal of AI art?

The only thing you have to do to reach that second conclusion is not believe that the nazi outcome is a bad one.

Anyway, I dunno. I'll never know what Stonetoss really meant, I just think it's interesting.

Edit: I don't normally like stonetoss. Can I have my comment back?

Nobody has been able to make a convincing argument in favour of generative AI. Sure, it's a tool for creating art. It abstracts the art making process away so that the barrier to entry is low enough that anyone can use it regardless of skill. A lot of people have used these arguments to argue for these tools, and some artists argue that because it takes no skill it is bad. I think that's beside the point. These models have been trained on data that is, in my opinion, both unethical and unlawful. They have not been able to conclusively demonstrate that the data was acquired and used in line with copyright law. That leads to the second, more powerful argument: they are using the labour of artists without any form of compensation, recognition, permission, or credit.

If, somehow, the tools could come up with their own styles and ideas then it should be perfectly fine to use them. But until that happens (it won't, nobody will see unintended changes in AI as anything other than mistakes because it has no demonstrable intent) use of a generative AI should be seen as plagiarism or copyright infringement.

So, how do art students learn? They do the exact same things. Only they do a lot less, because natural neural networks (aka brains) are not capable of processing training data as quickly. It's not as if every artist has to reinvent the wheel while generative AIs don't, and as such have an unfair advantage.

Look at inventions like the printing press! Did everybody like it? The Catholic Church certainly didn't! Is it a fantastic piece of technology anyway? Sure is!

Students learn techniques that they apply to their own personal style. The goal of art school isn't to create a legion of artists that can churn out identical art, it's to give young creatives the tools they need to realize the ideas in their head.

AI has no ideas in its head. Instead, it takes in a bunch of an artist's work, and then produces something that does its best to match the plagiarized artist's style exactly.

We don't know if there's much of a difference between an AI and a brain. If we look "into" an artificial neural net, we only see a lot of "weights" that don't make any sense. If we look into a brain, we also can't make any more sense of it. The difference is smaller than we want it to be, because admitting that either gives AIs a lot of credit or makes us less than we want to be. We are very biased, don't forget that.

How does copyright law cover this?

Copyright gives the copyright holder exclusive rights to modify the work, to use the work for commercial purposes, and attribution rights. The use of a work as training data constitutes using a work for commercial purposes, since the companies building these models are distributing and licensing them for profit. I think it would be a marginal argument to say that the output of these models constitutes copyright infringement on the basis of modification, but worth arguing nonetheless. Copyright does only protect a work up to a certain, hard-to-define amount of modification, but some of the outputs would certainly constitute infringement in any other situation. And these AI companies would probably find it nigh impossible to disclose specifically who the data came from.

Copyright gives the copyright holder exclusive rights to modify the work, to use the work for commercial purposes, and attribution rights.

Copyright remains a system of abuse that does more to empower large companies to restrict artistic development than it does to encourage artists. Besides which, you're failing to consider transformative work.

As it is, companies like Disney, Time Warner and Sony have so much control over IP that artists can't make significant profit without being controlled by those companies, and then only a few don't get screwed over.

There are a lot of valid criticisms of AI, but training it on work gated by IP law is not one of them... unless you also mean to say that human beings cannot experience the work either.

Training AI pretty much falls under fair use, as far as I'm aware... https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0

Rare EFF L.

Well, current law is not written with AI in mind, so what current law says about the legality of AI doesn't reflect its morality or how we should regulate it in the future

The law actually does a better job than you'd think. While it says little about stealing work to train neural networks, it does say that the final result requires significant human input to be eligible for copyright. It's the same precedent that prevents the copyright of a selfie taken by a monkey. Non human intelligences can't own stuff, and AI art isn't made by a human intelligence, so it's all public domain right now. It cannot be stolen unless someone has put in significantly more work on top of it.

I was talking more about whether the existence of an image AI, regardless of the images it generates, breaks copyright law because of how it was trained on copyrighted images

EFF does some good stuff elsewhere, but I don't buy this. You can't just break this problem down into small steps and then show for each step how it's fine when considered in isolation, while ignoring the overall effects. Simple example from a different area to make the case (I came up with this in 2 minutes, so it's not perfect, but you can flesh it out):

Step 1: Writing an exploit is not a problem, because it's necessary that e.g., security researchers can do that.

Step 2: Sending a database request is not a problem, because if we forbid it the whole internet will break

Step 3: Receiving freely available data from a database is not a problem, because otherwise the internet will break

Conclusion: We can't say that hacking into someone else's database is a problem.

What is especially telling about the "AI" "art" case: The major companies in the field are massively restrictive about copyright elsewhere, as long as it's the product of their own valuable time (or stuff they bought). But if it's someone else's work, apparently it's not so important to consider their take on copyright, because it's freely available online so "it's their own fault to upload it lol".

Another issue is the chilling effect: I for one have become more cautious sharing some of my work on the internet, specifically because I don't want it to be fed into "AI"s. I want to share it with other humans, but not with exploitative corporations. Do you know a way for me to achieve this goal (sharing with humans but not "AI") in today's internet? I don't see a solution currently. So the EFF's take on this prevents people (me) from freely sharing their stuff with everyone, which would otherwise be something they would encourage and I would like to do.

Unions would probably work, as long as you get some people the company doesn't want to replace in there too

Maybe also federal regulations, although those would probably just slow it down, because models are being made all around the world, including in places like Russia and China that the US and EU don't have legal influence over

Also, it might be just me, but it feels like generative AI progress has really slowed, it almost feels like we're approaching the point where we've squeezed the most out of the hardware we have and now we just have to wait for the hardware to get better

the true chad is on the left.

Hyper-realistic is the least interesting style imo, we have cameras for that

I don't see a problem with Generative AI, because it's just going to be a great tool for companies to add graphics real fast to their products. I don't see it replacing regular art, since "AI art" is just a natural progression for endless content that you can already scroll on social media when you are bored.

If the rise of LLMs to the mainstream has taught me anything, it's that artists are a very whiny bunch who vastly overvalue themselves.

For me, it's shown how vile people can be to others, just because they're not injuring themselves during work.

Image AI is cool as fuck and I refuse to pretend otherwise.

The dolts charging money for "commissions," or even just bragging about something they allegedly created, will be a blip for these few short years. The tech will become another tool anyone can use so long as they have a few free gigabytes... which is already a bit like saying, so long as they have a few free megabytes.

If there are any unavoidable AI tells, when someone sketches an image but has Photoshop finish it for them, we'll come to spot those as readily as we spot gradient fill or the oil-paint filter. They'll be a sign someone did some, but not all, of the hard work. Big whoop.

If there are no tells - if more training lets an overworked graphics card churn out exactly what you describe, as surely as a human artist might - that's gonna be fantastic for expression. And it won't prevent anyone from painting with actual brushes and canvas.

The video versions of this stupid GPU trick will allow any weirdo to turn their original script or sordid fanfiction into an actual movie you can watch with your eyeballs. The characters don't need to look or sound like any specific real actor. It's gonna get wild.