AI bots hallucinate software packages and devs download them

db0@lemmy.dbzer0.com to Technology@lemmy.world – 399 points
theregister.com


Wow, clever. Did you literally hallucinate this yourself or did you ask your LLM girlfriend for help?

And by literally, I mean figuratively.

You're gonna be real pissed to find out that computer bugs aren't literal bugs

Well, until a moth gets into your relays, anyhow.

I know it's a big word, but surely you can google what anthropomorphization is? Don't "ask" an LLM, those things output garbage. Just google it.

Watch out, those software bugs may start crawling out of your keyboard.

Like, literal garbage? The one sitting in my kitchen bin?!?!?!?!?!?!?!??!?!?!?!?!?!??!!?!

No fucking shit it’s an anthropomorphization; nothing that can be hosted on GitHub has true human qualities…

The point is that everyone knows what it means within the context of AI, and using other terminology would only serve to obfuscate your message such that the average person couldn’t understand it as easily.

Non-living things also don’t have “behavior” (“the way in which someone conducts oneself or behaves”), but, hey, look! People started anthropomorphizing things so much that it got added to the dictionary (“the way in which something functions or operates”).

It may not be ideal, and it may convince some people that LLMs are more human-like than they really are, but the one thing you haven’t done is suggest an alternative that would convey the meaning as effectively to the masses.

You call it a large language model, but there are much bigger things; it only approximates human language, and it isn't a physical model.
