Finally

Like A Duck@programming.dev to Programmer Humor@programming.dev

Most of the code I copied from GPT doesn't even work. It seems I spent more time fixing it than I would have spent writing it myself.


AI code is just bad code written by someone else that I now have to fix, and we all know the one job every coder loves is fixing code written by someone you can't ask: "why did you do it this way?"

It can be helpful, though, if you just need to find the right library function or the proper syntax in an unfamiliar language. Usually faster than a search.

Often when I tell ChatGPT what error its code produced it will immediately figure out what the bug was and fix it.
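For what it's worth, that feedback loop is easy to script. Here's a minimal sketch, assuming the openai Python client; the model name, prompts, and the example traceback are made up for illustration, not anything the commenter actually ran.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
MODEL = "gpt-4o-mini"  # placeholder model name

messages = [
    {"role": "user", "content": "Write a Python function that parses ISO 8601 timestamps."}
]
first = client.chat.completions.create(model=MODEL, messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# Run the generated code yourself; if it blows up, paste the error back in.
# The error text below is an invented example, not a real run.
messages.append({
    "role": "user",
    "content": "Running that raised: ValueError: Invalid isoformat string: "
               "'2024-01-02T03:04:05Z'. Please fix it.",
})
fixed = client.chat.completions.create(model=MODEL, messages=messages)
print(fixed.choices[0].message.content)
```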

Interacting with ChatGPT is a learned skill.

I've used it several times, and while the initial code may have some issues, you can get them cleared up with a few direct follow-ups.

I recall reading a while back of one person's strategy, whenever ChatGPT generates code for him he immediately tells ChatGPT "there's a bug in that code" (without checking or specifying). It'll often find one.

Another approach I've heard of is to tell ChatGPT that it's supposed to roleplay two roles when generating code, a programmer and a code reviewer. The code reviewer tidies up the initial code and fixes bugs.
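A rough sketch of both tricks, again assuming the openai Python client; the system prompt, model name, and task are illustrative assumptions, not what either commenter actually used.

```python
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model name

# Two-role framing: the model plays programmer and reviewer in one pass.
system = (
    "Act as two roles: a programmer who writes the code, then a code reviewer "
    "who checks it, fixes any bugs, and returns only the final reviewed code."
)
messages = [
    {"role": "system", "content": system},
    {"role": "user", "content": "Write a Python script that renames files in a folder to lowercase."},
]
first = client.chat.completions.create(model=MODEL, messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# The blind nudge: claim there's a bug without checking whether there actually is one.
messages.append({"role": "user", "content": "There's a bug in that code."})
second = client.chat.completions.create(model=MODEL, messages=messages)
print(second.choices[0].message.content)
```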

Since ChatGPT's code often works fine for me, I don't usually bother with these steps up front. I'm usually just after a quick and dirty script for a one-off task, so the quality doesn't matter much in my case.

And you know what you call changing words around to get a computer to do what you want? That's programming, baby! We are programming programmers!

Yeah, this is the way to interact with it. It makes sense as well: since it's only predicting the next word based on the previous words, it can catch a lot more in hindsight and in general be smarter about it.

I do this with TypeScript error codes. It's great at breaking down the problem. I never just copy-paste code from it, and I don't think anyone should do that anyway.

It's the same with art. AI is just like asking an art student to draw you a picture. Might be good, might look terrible. Don't ask for hands.

I’ve never had an issue with it personally but I’ve never asked it to do anything complex either

I've had success with trivial things, like "write a log file parser with this pattern" or "give me a basic three-part left-right-center header in HTML". Works OK for trivial side projects. I would never trust it in production. It's a tool, nothing more at this point. Like an electric drill: better than a hand crank, but you still need to know how to use it.
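For a sense of scale, the log-parser case is roughly this much code, which is why it's a reasonable thing to hand off. A sketch, assuming a made-up "timestamp level message" line format:

```python
import re
import sys
from collections import Counter

# Assumed log layout: "2024-01-02 03:04:05 ERROR something broke"
LINE = re.compile(r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (?P<level>\w+) (?P<msg>.*)$")

def count_levels(path: str) -> Counter:
    """Count how many lines of each log level appear in the file."""
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            match = LINE.match(line.rstrip("\n"))
            if match:
                counts[match.group("level")] += 1
    return counts

if __name__ == "__main__":
    print(count_levels(sys.argv[1]))
```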

Yeah, I genuinely hope we get to the point where I don't have to write code. Let me describe the architecture and algorithms with my voice, in English. I'd much rather spend my time solving abstract problems than typing syntax. If I have to essentially "teach" the AI what I want by dropping down the ladder of abstraction sometimes, that's OK.

Me, a CS student who bet my whole future on this field: yes, totally happy about this.

Me, with 20 years experience making software: yes, totally happy about this. (This makes it much easier to keep up with the latest newfangled bullshit.)

It won't, because AI generates based on existing content but won't make something new.

Wonder if human knowledge is now going to start degrading like a reposted JPEG as AI-generated information is recycled again and again into more AI systems.


If AI can write code based on English input then we should be able to feed it a spec and then just deploy to production the output.

And, as always, attempting to code to that spec will expose contradictions, inconsistencies, and frequently produce something that the customer judges as unfit for purpose.

Coding has never been the hardest part, except when it comes to security.