Maybe AI won't be taking all of our jobs after all?

Flying Squid@lemmy.world (mod) to Lemmy Shitpost@lemmy.world – 1425 points

And now for the secret ingredient... fear.

I see we are using the Gordon Ramsay cookbook today. I prefer Justin Wilson, where the secret ingredient is wine for the cook

Personally, I prefer Alton Brown, where the secret ingredient is goofiness.

There's an episode of Are You Afraid of the Dark from old school Nickelodeon (1992 to 1996) that had this as the exact premise.

Gotta have fear

You should use all-purpose fear, bread fear will make the cake tough.

I like how it just explicitly refuses to label the butger bowls

Because deep down he knows that's butter, not butger.

Fear is my favourite type of egg to use in cakes.

Yeah, here I was using love. Never thought of fear. I imagine it has an essence of pissed pants.

I don't know, it does recommend eggs ol like all master chefs.

I wouldn't use Cugar in your carrot cake. It will make it too savory.

But throwing in some extra walnut with three different labels is perfectly fine though.


Every time I see posts like this I remember a frequent argument I had in the early 2000s.

Every time I talked with photography students (I worked at an art school) or a general photography enthusiast, I got the same smug predictions about digital photography. The resolution sucked, the color sucked, the artist doesn't have enough control, etc. They all assured me that digital photography might be nice for casual vacation photos and maybe a few specialty applications but no way, no how, not even when hell freezes over would any serious photographer ever consider digital.

At the time I would think back to my annoying grade school discussions with teachers who assured me that (dot matrix) printers just sucked. Serious writing was done by hand and if you didn't know cursive you might as well be illiterate.

For some reason, people keep forgetting that technology marches on. The dumb glitches that are so easy to make fun of now will get addressed. There are billions of dollars pouring into AI development. Every major company and country is developing it. The pay rates for AI developer jobs attract huge numbers of people to solve those problems.

And up to now we have zero indication that the current approach isn't a dead end. Bill Gates, for instance, thinks that GPT-4 is a development plateau: https://heise.de/-9337989

Ah, yes, famous expert in artificial intelligence and machine learning, Bill Gates. I'm personally curious what Taylor Swift thinks about Chat GPT 5, myself. That girl's got a lot of money, which means she must be smart and has smart opinions on topics like generative AI and the efficacy of currently undeveloped LLMs.

We have plenty of indication when we look at past technologies that many thought had plateaued and that kept being improved.

Didn't Bill Gates think spam would be a thing of the past... by 2006?

My junk folder disagrees.

Certainly, but none of those technologies completely replaced what came before. The existing ways of doing things became hobbies and, for some, remain preferred over the technology that disrupted the field.

Not to mention, technologies will sometimes flop, only to resurface later in a completely different package. The PDA was maybe popular for a year? But now we all have smartphones which effectively capture that concept. The Wii U failed, but the Switch has been wildly popular.

It's probably premature to say that AI will completely fail, but also that AI will completely replace everything. I just used a Polaroid camera this past weekend at a wedding, and it was enjoyable in a way digital cameras or phones wouldn't have been. I still write things out at work, particularly if I'm trying to wrap my head around some math or a difficult concept. Typing it out doesn't work as well.

I think it is safe to say that there are some things AI will never be able to replace, just like there are some things digital cameras couldn't replace, nor our phones.

There's either the "it'll never work" take or the "it'll destroy the industry!" take, and both are kinda childish. New technologies are tools, nothing more, nothing less. Learn to use them and they'll make your life easier. Integrate them if they're threatening your livelihood. Learn and adapt, it's how progress has always worked.

Same. Remember the same arguments. Heck I still get into it with clients sometimes. Usually snark works

Me: wasn't 2013 nice? I had a full set of hair and didn't have to diet, but as much as I might miss 2013 it isn't 2013 anymore. Time to move forward.

I'm guessing this argument has been going on longer than either of us can remember.

There was a long time when guns were considered interesting toys but not something a sane person would take onto the battlefield; especially not without some sort of backup. Hell, the "three musketeers" were more known for their fencing than their firearms skill.

I'm sure back in the day some chucklehead complained that papyrus was cute but anything important had to be carved in stone tablet.

It's shocking to me how many people try to make cakes without fear. It just doesn't taste the same without it

Last time I tried that I burned my entire kitchen down

Without fear ... all you have is drywall compound and a great mix to patch a few holes in the wall.

Why is it even called artificial intelligence, when it's obviously just mindless artificial pattern reproduction?

Because that's what intelligence is. There's a very funny video floating around of a squirrel repeatedly trying to bury an acorn in a dog's fur and completely failing to understand why it's not working. Now sure, a squirrel is not the smartest animal in the world, but it does have some intelligence, and yet there it is just mindlessly reproducing a pattern in the wrong context. Maybe you're thinking that humans aren't like that, that we make decisions by actually thinking through our actions and their consequences instead of just repeating learned patterns. I put it to you that if that were the case, we wouldn't still be dealing with the same problems that have been plaguing us for millennia.

Machine Learning is such a better name. It describes what is happening - a machine is learning to do some specific thing. In this case to take text and output pictures... It's limited by what it learned from. It learned from arrays of numbers representing colours of pixels, and from strings of text. It doesn't know what that text means, it just knows how to translate it into arrays of numbers... There is no intelligence, only limited learning.
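
Since that comment is essentially describing how these systems see their inputs, here is a minimal sketch of the idea, purely illustrative: the word-to-number mapping and the tiny blank image below are made up for the example, not any real model's tokenizer or pixel format.

```python
import numpy as np

prompt = "carrot cake with butter and sugar"

# Crude stand-in for a tokenizer: map each word to an integer id.
vocab = {word: i for i, word in enumerate(sorted(set(prompt.split())))}
token_ids = np.array([vocab[word] for word in prompt.split()])
print(token_ids)    # [3 2 5 1 0 4] -- to the model, the text is just numbers

# A 64x64 RGB image is likewise just a 64x64x3 array of values from 0 to 255.
image = np.zeros((64, 64, 3), dtype=np.uint8)
print(image.shape)  # (64, 64, 3) -- the picture is just numbers too
```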

Are we so different?

Isn't meaning just comparing and contrasting similarly learned patterns against each other and saying "this is not all of those other things"?

The closer you scrutinize meaning the fuzzier it gets. Linguistically at least, though now that I think about it I suppose the same holds true in science as well.

Yes, we absolutely are different. Okay, maybe if you really boil down every little process our brains do there are similarities, we do also do pattern recognition, yes. But that isn't all we do, or all ML systems do, either. I think you're selling yourself short if you think you're just recognising patterns!

The simplest difference between us and ML systems was pointed out by another commenter - they are trained on a dataset and then they remain static. We constantly re-evaluate old information, take in new information, and formulate new thoughts and change our minds.

We are able to perceive in ways that computers just can't - they can't understand what a smell is because they cannot smell, they can't understand what it is to see in the way that we do because when they process images it is exactly the same to a computer as processing any other series of numbers. They do not have abstract concepts to relate recognised patterns to. Generative AI is unable to be truly creative in the way that we can, because it doesn't have an imagination, it is replicating based on its inputs. Although, again, people on the internet love to say "that's what artists do", I think it's pretty obvious that we wouldn't have art in the way we do today if that was true... We would still be painting on the walls of caves.

Machine Learning isn't a good name for these services because they aren't learning. You don't teach them by interacting with them. The developers did the teaching and the machine did the learning before you ever opened the browser window. You're interacting with the result of learning, not with the learning.
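
A toy sketch of that train-once, interact-later split (entirely hypothetical, not how any real service is built): the "learning" happens once in train(), and respond() only reads the frozen result without ever updating it.

```python
def train(examples):
    """'Developer time': count word frequencies once, then freeze them."""
    counts = {}
    for sentence in examples:
        for word in sentence.split():
            counts[word] = counts.get(word, 0) + 1
    return counts  # these act as the fixed "weights"

def respond(weights, prompt):
    """'User time': only reads the frozen weights, never updates them."""
    known = [w for w in prompt.split() if w in weights]
    return f"I recognise {len(known)} of your words."

weights = train(["add flour sugar butter", "bake the cake with love"])
print(respond(weights, "where is the fear"))  # prints: I recognise 1 of your words.
```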

AI is also the minimax algorithm for solving tic-tac-toe, and the ghosts that chase Pac-Man around. It's a broad term. It doesn't always have to mean "mindblowing super-intelligence that surpasses humans in every conceivable way". So it makes mistakes - therefore it's not "intelligent" in some way?
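
For reference, this is roughly what that tic-tac-toe "AI" looks like: plain minimax search over every possible game, with no learning anywhere. A minimal sketch, not tied to any particular implementation:

```python
def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    lines = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) for `player`; 'X' maximises, 'O' minimises."""
    w = winner(board)
    if w == 'X':
        return 1, None
    if w == 'O':
        return -1, None
    if ' ' not in board:
        return 0, None  # draw
    moves = []
    for i, cell in enumerate(board):
        if cell == ' ':
            board[i] = player
            score, _ = minimax(board, 'O' if player == 'X' else 'X')
            board[i] = ' '
            moves.append((score, i))
    return max(moves) if player == 'X' else min(moves)

# Perfect play from an empty board is a draw: prints (0, 8).
print(minimax([' '] * 9, 'X'))
```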

A lot of the latest thought in cognitive science couches human cognition in similar terms to pattern recognition - some of the latest theories are known as "predictive processing", "embodied cognition", and "4E cognition" if you want to look them up.

Honestly, I think it's marketing. AI sells better than machine learning programs.

Because it is intelligent enough to find and reproduce patterns. Kind of like humans.

But it is artificial.

Well, I think it comes down to a fundamental belief about consciousness. If you're non-religious, you probably think that consciousness is a purely biological and understandable process. If it's completely understandable, it should be replicable. Therefore, artificial intelligence. But it's hard as dong to do well.

Why the hell are you being downvoted? I thought Lemmy had no religious fundamentalists or spiritualists

I guess the same reason why smartphones are called "smart" phones.


I got potper and vedk but what's the 2 yellow cube please help it's urgent

We haven't made AI, we've just put shinier stickers on a god damn Chinese Room

Very true, that's exactly how it feels to me.

This reminds me of my last and final salvia trip. I fell through a black hole (my black papasan chair) and landed in the Steamboat Willie cartoon, except the colors were black, white, and evil. Then a giant red theater curtain fell on me, and I was back in reality, except everything was made of squares that were rotating on their vertical axis.

Turns out that AI can't make cakes or do recipes lol

ChatGPT can totally make a cake recipe tho, without even telling you about how their great uncle used to make that recipe when they would come back for holidays

No, DALL-E 3, an AI known for creating brilliant pictures but not trained for general intelligence or cooking, is the one creating bad recipes.

If I had a penny for every "this new stuff will never work" or "this new stuff will never take off" where it then did, I could probably afford a nice dinner.

This "ai" isn't going away, there's too much profit to chase and corporate knows it.

I mean it's not even new. The last decade we mostly just called it machine learning and this type of work has been going on longer than that. It only feels new because we're putting generative tools in people's hands. It's already proven itself, it's already replacing jobs.

Truth. The generative AI stuff is just slightly better, slightly more accessible machine learning technologies trained on massive amounts of public internet data.

About the jobs part, though, it's also creating a lot of jobs. Whether it will be more or fewer jobs long term is hard to say. However, machine learning right now creates a ton of high-paying technical jobs.

I can say something similar, if I had a penny for every time the next big thing was pronounced and hyped, but didn't take off, I could probably afford a Rivian.

"Everything that can be invented has been invented" - Charles H. Duell, 1899.

[the origins of that quote are a bit sketchy, but I'll just ignore that because it's a funny quote]

Some tech bro will attempt to make this cake and will tell someone it was better than anything some uppity WOKE human baker could have made, regardless of how bad it turned out.

Everything is labeled butter except the butter. I feel that there is an "I can't believe it's not butter" joke in there.

Vedk is something my grandmother used to add into her pound cake when I was a kid. Really set itself apart from the rest!

My spouse is now mocking me because I giggled in front of a picture of ingredients. "Ooooh funny... ingredients.... oh .. oh .. oh..."

Not yet, not right now; those outputs are hilarious. But still, maybe sooner than we'd like: as long as AI keeps learning, improving, and advancing, it may sadly come true.

There are certain ways in which AI seems to plateau short of the desired usefulness. If the output is supposed to be so precise and verbatim that a machine could consume it directly, it tends to make these sorts of mistakes. Where it shines is in ingesting 'messy' direction and data, processing it, and spitting it back at a human who still has to be able to correct 'messy' output, but with the AI doing a lot of the heavy lifting and drawing on more data than the human could.

This is a huge chunk of utility, but for very specific things, and for things that really require expertise, there seems to be an enduring need for human review. However, the number of people needed may be dramatically reduced.

Similar to other areas of computing. Back in the 1930s, an animated film might take 1,500 people to make. Now fewer than a tenth of that could easily turn out better quality with all the computer assistance. We've grown our ambitions, and to make a similarly 'blockbuster'-scale production that is insanely more detailed and thoroughly animated, we're still talking about less than half the staff.

It seems that AI will be another step in that direction: reducing the staff required to exceed current expectations, but not entirely displacing them. It becomes extra challenging, since the impacts may be felt across so many industries at once, with no clear area to soak up the reduced demand for labor.

I think LLMs will be fundamentally unable to do certain things because of how they function. They're only as good as what they're trained on, and with how fast and loose companies have been with that, they'll have formed patterns based on incorrect information.

Fundamentally, it doesn't know what it generates. If I ask for a citation, it'll likely give me a result that seems legit, but doesn't actually exist.

ah yes, using a highly specialized AI intended for image generation to create something that is usually in text form. truly a good judge of the quality of AI...

As far as I see it, the AI decided to label the ingredients
