With the creation and use of ChatGPT and Bard, I bet the ELI5 community and subreddit have had fewer posts, because people can just ask those instead.

TehBamski@lemmy.world to Showerthoughts@lemmy.world – 106 points –

I use ChatGPT just for programming and it gives wrong answers half of the time.

i'm studying mechanical engineering and there's a guy in our class who's obsessed with chatgpt. he's always trying to solve all of the tasks using chatgpt and he's always the first to share the solution in zoom. so far it's never been correct but he just sticks with it...

I am a mechanical engineer. I was able to get special permission from my IT department to use LLMs as part of my workflow, as a guinea pig for the department. It is completely useless.

One of the most valuable skills an engineer can have is being able to communicate technical information effectively to different audiences. GPT is an overly polite meat grinder, spitting out half-chewed technical slop.

AI will have a better sense of something like mechanical engineering when it’s inhabited a body for a while.

If I'm recalling right, I asked ChatGPT about a banked turn with friction. It didn't give the answer I was looking for.
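For reference (this is the standard textbook result, not something from the thread), the maximum no-slip speed on a banked turn with friction — bank angle $\theta$, radius $r$, static friction coefficient $\mu_s$ — which is presumably the kind of answer the commenter was after:

```latex
% At maximum speed, friction acts down the incline at its limit f = \mu_s N.
% Vertical balance:   N\cos\theta - \mu_s N\sin\theta = mg
% Horizontal (centripetal): N\sin\theta + \mu_s N\cos\theta = \frac{mv^2}{r}
% Dividing the two equations and solving for v:
v_{\max} = \sqrt{\, r g \,\frac{\tan\theta + \mu_s}{1 - \mu_s \tan\theta}\,}
```

Setting $\mu_s = 0$ recovers the frictionless banked-turn result $v = \sqrt{rg\tan\theta}$, which is a quick sanity check on the formula.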

I asked ChatGPT about the best big phones of 2022. One of the phones it cited was released in 2021.

Yeah, I do too, and it is indeed frequently incorrect. It is good when you have basically no idea what you're doing. It can help you get on track, and then you can research by yourself.

Not picking fights. Just curious.

Is this an improvement or a decline in your overall programming success?

I am a hobbyist (and not very good) programmer, and while ChatGPT (free version) often gives me wrong answers, it still gives me some insight into how some stuff could be done (intentionally or not) or how something works. It's actually somewhat helpful for learning, though I guess it could be a double-edged sword even in that regard.

It is also pretty good at detecting simple code errors, from what I have seen.

Overall more positive than negative, but I wouldn't recommend to use it blindly.
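To illustrate what "detecting simple code errors" tends to mean in practice, here's a hypothetical example (mine, not the commenter's) of the kind of off-by-one bug an LLM review usually does catch:

```python
# Hypothetical example: an off-by-one bug of the sort LLM code review
# is reasonably good at spotting.

def sum_up_to(n):
    # Buggy: range(n) stops at n - 1, so the final term n is dropped.
    return sum(range(n))

def sum_up_to_fixed(n):
    # Fixed: range(n + 1) includes n itself.
    return sum(range(n + 1))

print(sum_up_to(5))        # 10 -- missing the 5
print(sum_up_to_fixed(5))  # 15
```

The bug is obvious once named, which is exactly why it's a good fit for this kind of tooling — and why subtler logic errors still need a human check.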

Huge improvement in work flow.

Don’t get it to write your code for you; it’s not gonna work 3/10 times. Instead, use it to review your code and help remove code smells when refactoring.
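A hypothetical before/after (my own illustration, not from the thread) of the kind of smell-removal refactor this comment is describing — collapsing copy-pasted branches, a classic "duplicated code" smell:

```python
# Hypothetical shipping-cost function used only to illustrate the refactor.

def shipping_cost_smelly(weight_kg, express):
    # Smell: both branches repeat the same formula with one constant changed.
    if express:
        return 5.0 + weight_kg * 2.5
    else:
        return 5.0 + weight_kg * 1.0

def shipping_cost_refactored(weight_kg, express):
    # Refactor: name the varying part (the rate) and state the formula once.
    rate = 2.5 if express else 1.0
    return 5.0 + weight_kg * rate
```

The behavior is identical; the win is that a future change to the base fee or the formula now happens in one place instead of two.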

I don't use ChatGPT, but I work with colleagues who do. Their productivity visibly drops, and half the time I gotta fix their shitty code.

I use chatGPT for any topic I'm curious about, and like half the time when i double check the answers it turns out they're wrong.

For example, i asked for a list of phones with screens that don't use PWM, and when i looked up the specs of the phones it recommended, it turned out they all had PWM — even though the chatGPT answer explicitly stated that each of these phones doesn't use PWM. Why does it straight up lie?!

It's not lying. It has no concept of context or truth. Auto complete on steroids.