AI bots hallucinate software packages and devs download them

db0@lemmy.dbzer0.com to Technology@lemmy.world – 399 points –
theregister.com

When WE hallucinate, it's because our internal predictive models fly off the rails, filling in the blanks from assumptions rather than concrete sensory information, and generating results that conflict with reality.

Is it really? You make it sound like this is a proven fact.

I believe that's where the scientific community is moving, based on watching this Kyle Hill video.

Here is an alternative Piped link(s):

this Kyke Hill video

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

I know I'm responding to a bot, but... how does a PipedLinkBot turn "Kyle Hill" into "Kyke Hill"? More AI hallucinations?

i mean, idk about the assumptions part of it, but if you asked a psych or a philosopher, im sure they would agree.

Or they would disagree and have about three pages' worth of thoughts they'd feel uneasy about not immediately exclaiming.

Better than one of those pesky unproven facts