It must be a silent R

Schal330@lemmy.world to Programmer Humor@programming.dev – 810 points –


In what way is presenting factually incorrect information as if it's true not a bad thing?

Maybe in a "it is not going to steal our jobs... yet" way.

True, but if we suddenly invent an AI that can replace most jobs, I think the rich have more to worry about than we do.

LLMs operate on tokens, not letters, so this is expected behavior. A hammer is bad at controlling a computer, and that's okay. The problem is the people telling you to use a hammer to operate a computer, not the hammer's inability to do so.
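A rough sketch of why this trips models up (the subword split below is hypothetical, purely for illustration; real tokenizers choose different boundaries):

```python
# Hypothetical subword tokenization of "strawberry" (illustrative only;
# actual tokenizer output differs by model and encoding).
tokens = ["st", "raw", "berry"]

# The model's input units are whole tokens. The letter "r" appears
# *inside* tokens but is never an input unit itself, so the model has
# no direct view of individual letters.
word = "".join(tokens)
print(word)                 # strawberry

# Counting letters requires dropping to the character level, which is
# exactly the view the model does not get.
print(word.count("r"))      # 3
```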

Claude 3 nailed it on the first try

It would be luck-based for pure LLMs, but now I wonder whether the models that can use Python notebooks might be able to write a script to count it. It's actually possible for an AI to get this answer consistently correct these days.
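The kind of one-liner a tool-using model might generate for this (a sketch, not any particular model's actual output):

```python
# Tokenization doesn't matter once you drop into code: Python sees
# the string as a sequence of characters, so the count is exact.
word = "strawberry"
count = word.count("r")
print(f"There are {count} 'r's in '{word}'")  # There are 3 'r's in 'strawberry'
```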