In Leaked Audio, Amazon Cloud CEO Says AI Will Soon Make Human Programmers a Thing of the Past

shish_mish@lemmy.world to Technology@lemmy.world – 471 points –
futurism.com

Guy who buys programmers and sells AI thinks he can sell more AI and stop buying programmers.

This is up there with Uber pretending self driving cars will make them rich.

I mean... self driving cars probably will. Just not as soon as they think. My guess, at least another decade.

Not until a self-driving car can safely handle all manner of edge cases thrown at it, and I don't see that happening any time soon. The cars would need to recognize situations that weren't explicitly programmed into them and figure out a safe way to deal with them.

As someone said in this thread: as soon as they can convince legislators, capital will go for it, even if they are murder machines.

Borrowing from my favorite movie: "it's just a glitch".

I doubt it. The liability would be far too great. Ambulance chasing lawyers would salivate at the chance to represent the families of pedestrians struck and killed by buggy self driving cars. Those capitalists don't want endless years of class action cases tying up their profits.

When was the last time a corporation got anything other than a slap on the wrist and a small donation to the government just so they could keep doing what they're doing?

Like Boeing. As much as I hate people saying dumb shit about a company they don't know much of anything about, Boeing is the epitome of what you said: a company getting a small slap on the wrist for gross negligence in the name of profit. Especially because of all the goodies they develop for the US Federal Government, and since they are a worldwide company, our government isn't the only one. They know they sit in a position of power because they fill a hole in an industry that basically has to be filled. And people want to try to bankrupt them with some weird ideas about voting with their dollar. But that's nonsense.

People don't understand how they build planes not to sell but to lease, and how these leases keep their customers paying through the nose for an asset they don't own, while staying responsible for maintaining that asset until it's time to upgrade. They cornered the market on enshittification long before the likes of Microsoft and Google, and they have mastered the art of it.

Tesla or Uber or whoever wish they could do what Boeing has been doing for decades. People have this rose-tinted view of what Boeing "used to be" when it was "run by engineers", which is hilarious to me. Back in the day they hedged their bets in a race to the bottom, developing a two-engined plane that wouldn't catastrophically fail and fall out of the sky if it lost an engine, so they could skirt regulations around the world that required planes to have more than two engines. Those extra engines added upkeep and fuel costs, making air travel incredibly expensive. Their engineers managed it, and by playing the long game they could develop planes that were more fuel efficient and cheaper to maintain, meaning their customers could afford to buy more of them and provide air travel opportunities to more people.

You know what we got from that? Shittier seating arrangements, poorly manufactured planes, and baggage fees out the wazoo, on top of ever-rising ticket prices for air travel.

Alternatively, measures could be put in place to eliminate certain edge cases. You can see similar concepts in places with separate infrastructure for things like buses or HOV lanes: places where you could still ostensibly allow "regular" vehicles to travel but limit or eliminate pedestrians and merging.

Abolish snowfall

We're working on it, although for some parts of the world we will need to go through periods of increased snowfall to get there.

Tbf, human-operated cars are also murder machines; we're just more willing to tolerate it.

There will be a massive building in, like, India with many thousands of atrociously paid workers donning VR goggles, who spend their long hours constantly Quantum Leap-ing into traumatizing last-second emergency situations that the AI gives up on. Instantly they slam on the brakes as hard as they can. They drink tea. There's suicide netting everywhere. They were the lowest bidder this quarter.

I wish I could give this comment more than a simple upvote. I want to mail you a freshly baked cinnamon bun.

Plus, as soon as the cars can drive themselves people will stop needing Uber in many cases.

No parking? Just tell your car to go park on a street 10 blocks away.

Drunk? Car drives itself while you sleep.

Going to the airport? Car drops you off and returns home. Car also picks you up when you are back.

This is combined with the fact that people will do more disgusting things in an Uber without the driver there. If you have ever driven for Uber, you know that 10% of people are trying to eat or drink in the car. They are going to spill and it's going to end up like the back of a bus.

Not sure if we're agreeing and saying exactly the same thing here, but Uber's business model is to get suckers who are bad at math to own the cars. Uber's business model does not work if they have to own their own cars. Self-driving Uber doesn't work because Uber would have to own the cars and therefore has to cover vehicle insurance, vehicle depreciation, and so on out of its own margin.

Their accident rate continues to decrease and things like quorum sensing and platooning are going to push them to be better than humans. You're never going to have a perfect system that never has accidents, but if you're substantially better than humans in accidents per mile driven and you're dramatically improving throughput and reducing traffic through V2X, it's going to make sense to fully transition.

I imagine some east Asian countries will be the first to transition and then the rest of the world will begrudgingly accept it once the advantages become clear and the traditional car driving zealots die off.


"handle" is doing a lot of heavy lifting there. The signs are already there that all of these edge cases will just be programmed as "safely pull over and stop until conditions change or a human takes control". Which isn't a small task in itself, but it's a lot easier than figuring out to continue (e.g.) on ice.

Those self-driving cars are called trains. They already can be self-driving. In a situation where the computational complexity and required precision are somewhat controlled, that is, on train tracks.

Just like all humans can do right now, right?

I never see any humans on the road staring at their phones and driving like shit.

The problem with self-driving cars isn't that it's worse than human drivers on average, it's that it's SO INCREDIBLY BAD when it's wrong that no company would ever assume the liability for the worst of its mistakes.

But if the average is better, then we clearly win by using it. I'm not following the logic of focusing on the worst-case scenarios as opposed to the average.

Average is better means fewer incidents overall. But when there are incidents, the damages for those incidents tend to be much worse. This means the victims are more likely to lawyer up and go after the company responsible for the AI that was driving, and that means that the company who makes the self-driving software better be prepared to pay for those worst case scenarios, which will now be 100% their fault.

Uber can avoid liability for crashes caused by their human drivers. They won't be able to do the same when their fleet is AI. And when that happens, AI sensibilities will be measured by human metrics, because courts are run by humans. The mistakes they make will be VERY expensive ones, because a minor glitch can turn an autonomous vehicle from the safest driving experience possible into a rogue machine with zero sense of self-preservation. That liability is not worth the cost savings of getting rid of human drivers yet, and it won't be for a very long time.


Maybe, or maybe like harnessing fusion it will always be “just a few more years away!”

Self driving taxis are definitely happening, but the people getting rich in a gold rush are the people selling shovels.

Uber has no structural advantage because their unique value proposition is the army of cheap drivers.

We're a century away from self-driving cars that can handle snowfall

Just this year, farmers with self-driving tractors got screwed when a solar flare made GPS inaccurate: the tractors went wild because they were programmed on the assumption that GPS is 100% reliable and accurate, with no way to override.

I'm right there with you, but I also remember hearing that this time last decade.


I hope this helps people understand that you don't get to be CEO by being smart or working hard. It's all influence and gossip all the way up.

In fact, being stupid is probably a benefit.

Yep, if I had that kind of money and were surrounded by like-minded people, I'd agree too. Unfortunately I'm cursed with a rational mind 🙃🙃🙃

"Coding" was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack

This right here.

The problem is not coding. Anybody can learn that with a couple of well-focused courses.

I'd love to see an AI find the cause of a catastrophic crash of a machine that isn't caused by a software bug.

Catching up on what Carmack's been up to for the last decade has revived the fan in me. I love that 2 years after leaving Oculus to focus on AGI, this is all the hype he's willing to put out there.

Agreed! Problem solving is core to any sort of success. Whether you're moving up or on for more pay, growing tomatoes or nurturing a relationship, you're problem solving. But I can see AI putting the screws to those of us in tech.

Haven't used it much so far, since my last job didn't afford much coding opportunity, but I wrote a Google Apps Script to populate my calendar given changes to an Excel sheet. Pretty neat!

With zero Apps Script experience, I tried going the usual way, searching web pages. Got it half-assed working, then got stuck. Asked ChatGPT to write it and boom, solved with an hour's additional work.

You could say, "Yeah, but you at least had a clue about general scripting and still had to problem solve. Plus, you came up with the idea in the first place, not the AI!" Yes! But the point is, AI made the task shockingly easier. That was at a software outfit, so I had the opportunity to chat with my dev friends and see what they were up to. They were properly skeptical/realistic about what AI can do, but they still used it to great effect.

Another example: I struggled like hell to teach myself database scripting, so ignorant I didn't know the words to search for, and the solutions I found were more advanced than my beginner work required (or understood!). My first script was 8 short lines and took 8 hours. Had AI been available to jump-start me, I could have done it in an hour, maybe two. That's a wild productivity boost. So while AI will never make programmers obsolete, we'll surely need fewer of them.

"Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of"

They've been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don't like that kind of thing.

Unfortunately, I don't think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.

I’m relaxed. IMHO this is just another trend.

In all my career I haven't seen a single customer who could tell me out of the box what they need. A big part of my job is talking to all entities to get the big picture: gathering information about software and hardware interfaces, visiting places to see PHYSICAL things like sub-processes or machines.

My focus may be shifted to less coding in an IDE and more of generating code with prompts to use AI as what it is: a TOOL.

I'm annoyed by this mentality of get rich quick, earn a lot of money with no work, develop software without earning the skills and experience. It's like using libraries for every little problem you have to solve. Worst case you land in dependency/debug hell and waste much more time debugging stuff other people wrote than you would have spent coding it yourself and understanding how things work under the hood.

And by that time, processors and open source AI are good enough that any noob can ask his phone to generate a new app from scratch. You'd only need big corpo for cloud storage and then only when distributed systems written by AI don't work.

the number of positions will be reduced as much as the owning class can get away with

Well, after all, you don't hire people to do nothing. It's simply a late-stage capitalism thing. Hopefully one day we can take the benefits of that extra productivity and share the wealth. The younger generations seem like they might move us that way in the coming decades.

I really hope so. Sometimes I think the kids are alright. Like the 12 year old owning the My Pillow idiot. Then I hear the horror stories from my school teacher friends.

When will AI replace CEOs?

Mark Zuckerberg is not a robot?

Lizardman. Easy to confuse the two as they’re both cold to the touch by default.

What kind of weird conspiracy theory is this?

Lizardmen are not robots.

No!!! They're useful because uhhmm uuhhhh uhmm uhhhbbh dndusfjduehrhrh

It's worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.

With that in mind, while it's a hilariously stupid comment to make, he's in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.

PM and sales, eh?

So you're saying his lack of respect for programmers isn't new, but has spanned his whole career?

Devaluing another person's labour is what being an executive is all about.

Speaking as a wage slave with no bitcoin or crypto: the technology has been hijacked by these types, and could otherwise have been useful.

I'm not entirely sold on the technology, especially since immutable ledgers have been around long before the blockchain, but also due to potential attack vectors and the natural push towards centralisation for many applications - but I'm just one man and if people find uses for it then good for them.

What other solutions to double spending were there in financial cryptography before?

No idea, I don't work in fintech, but was it a fundamental problem that required a solution?

I've worked with blockchain in the past, and the uses where it excelled were in immutable bidding contracts for shared resources between specific owners (e.g. who uses this cable at x time).

Fully decentralized p2p cryptocurrency transactions without double spending by proof of work (improvement upon Hashcash) was done first with Bitcoin. The term fintech did not exist at the time. EDIT: looked it up, apparently first use as Fin-Tech was 1967 https://en.wikipedia.org/wiki/Fintech -- it's not the current use of the term though.
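To make "proof of work" concrete, here's a minimal Python sketch of the Hashcash-style idea (a toy illustration, not Bitcoin's actual mining code): grind nonces until the hash of the data clears a difficulty target.

```python
import hashlib
from itertools import count

def proof_of_work(data: str, difficulty: int = 4) -> int:
    """Find a nonce so that sha256(data + nonce) starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # expensive to find, trivial for anyone to verify

# Finding takes ~16**difficulty hash attempts on average; checking takes one.
print(proof_of_work("block payload"))
```

That asymmetry (hard to produce, cheap to verify) is what makes double spending expensive to pull off without a central authority.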

I guess an additional bonus for crypto would be not burning the planet, and actually having the real value of something, not an imagined one.

Yeah, how's that goin'?

It can write really buggy Python code, so... Yeah, seems promising

It does a frequently shitty job of writing docstrings for simple functions, too!

It does great with common code patterns that it can just drop in place. 99.9% artificial, 0.1% intelligence.

When I last tried to let some AI write actual code, it didn't even compile 🙂 And another time, when it actually compiled, it was trash anyway and I had to spend as much time fixing it as I would have spent writing it myself in the first place.

So far I can only use AI as a glorified search engine 😅

Lol sure, and AI made human staff at grocery stores a thing of the....oops, oh yeah....y'all tried that for a while and it failed horribly....

So tired of the bullshit "AI" hype train. I can't wait for the market to crash hard once everybody realizes it's a bubble and AI won't magically make programmers obsolete.

Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers...

It's the pinnacle of MBA evolution.

In their worldview engineers are raw material, and all that matters in the world is knowing how to do business. So it just makes sense to them that one can guide and use and direct engineers to replace themselves.

They don't think of fundamentals, they really believe it's some magic that happens all by itself, you just have to direct energy and something will come out of it.

Lysenko vibes.

This wouldn't happen were the C-suite not mostly composed of bean counters. They really think they are to engineers what officers are to soldiers. The issue is, an officer must know perfectly well everything a soldier knows, plus their own specialty, and also bears responsibility. Bean counters generally have less education, experience, and intelligence than the engineers they direct, and they avoid responsibility all the time.

So, putting themselves as some superior caste, they really think they can "direct progress" to replace everyone else the way factories with machines replaced artisans.

It's literally a whole layer of people who know how to get power, but not how to create it, and imagine weird magical stuff about things they don't know.

I am a black box

Yeah, that's what I mean. Black boxes are a concept to accelerate development, but we can't blackbox ourselves through civilization. They are also mostly useful for horizontal, not vertical relationships, which people misunderstand all the time (leaky abstractions).

This actually should make us optimistic. If hierarchical blackboxing were efficient, the state of human societies would be certain to become more and more fascist and hierarchical over time without slowing down in development. But it's not.

Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn't exist.

A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways they cannot be used.

The funny thing was, it knew and could explain why those functions couldn't be used when I corrected it. But it wasn't able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.

Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it's talking about to the average person.

Basically, AI is currently functioning at the same level as the average tech CEO.

Spoken like someone who manages programmers instead of working as one.

The job of CEO seems the far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + LLM + character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
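A tongue-in-cheek Python sketch of that "fairly basic algorithm with weighted goals": every name and number here is hypothetical, obviously, not a description of any real system.

```python
from dataclasses import dataclass

@dataclass
class BoardGoal:
    name: str      # e.g. "quarterly_profit" or "employee_morale"
    weight: float  # chosen by the board

def ceo_decide(options: dict[str, dict[str, float]], goals: list[BoardGoal]) -> str:
    """Return whichever option scores highest on the board's weighted goals."""
    def score(metrics: dict[str, float]) -> float:
        return sum(g.weight * metrics.get(g.name, 0.0) for g in goals)
    return max(options, key=lambda name: score(options[name]))

goals = [BoardGoal("quarterly_profit", 0.9), BoardGoal("employee_morale", 0.1)]
options = {
    "invest in the product": {"quarterly_profit": 2.0, "employee_morale": 5.0},
    "lay off 10% of staff": {"quarterly_profit": 4.0, "employee_morale": -5.0},
}
print(ceo_decide(options, goals))  # -> "lay off 10% of staff", naturally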

Cheaper too I bet.

Also lol for the AI coder 😁 good luck with that 😂

Worst-case scenario, the AI fucking loses it and decides to do some wacky but weirdly effective shit. Like spamming out 1-width units en masse in Hearts of Iron 4.

I just want to remind everyone that capital won't wait until AI is "as good" as humans, just when it's minimally viable.

They didn't wait for self-checkout to be as good as a cashier; They didn't wait for chat-bots to be as good as human support; and they won't wait for AI to be as good as programmers.

And then we should all charge outrageous hourly rates to fix the AI generated code.

They'll try the opposite. It's what the movie producers did to the writers. They gave them AI generated junk and told them to fix it. It was basically rewriting the whole thing but because now it was "just touching up an existing script" it was half price.

They can try. But cleaning up a mess takes a while, and there's no magic wand to make it go faster.

Yeah they'll try. Surely that can't cascade into a snowball of issues. Good luck for them 😎

A strike by tech workers would be something else. Curious what would happen if the ones maintaining the servers for entertainment, the stock market, or factories just walked out. On the other hand, tech doesn't have unions.

You better fucking believe it.

AI is going to be the new outsourcing, only cheaper than outsourcing and probably less confusing for us to fix.

They won't, and they'll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We've already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn't able to do their jobs.

And because of all the theft and malfunctions, the nearby supermarkets replaced self-checkout with normal cashiers again.

If it's AI doing all the work, the responsibility falls to the remaining humans. There'll be interesting lawsuits when there's the inevitable bug that the AI itself can't figure out.

We saw this happen with Amazon's cashier-less stores. They were actively trying to use a computer-based AI system, but it didn't work without thousands of man-hours from real humans, which is why those stores are going away. Companies will try this repeatedly until they get something that works or they run out of money. The problem is, some companies have cash to burn.

I doubt the vast majority of tech workers will be replaced by AI any time soon. But they'll probably keep trying because they really really don't want to pay human beings a liveable wage.

Unexpected item in bagging area? I think you meant free item in bagging area.

Already happening. Cisco just smoked another 4,000 employees. And anecdotally, my tech job hunt is, for the first time, not going so hot.

It's really funny how AI "will perform X job in the near future", yet you barely, if ever, see articles saying that AI will replace CEOs in the near future.

Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs while disguising it to their CTO and CEO as a code-writing AI.

C-suites are like Russian elites.

The latter are some thieves who've inherited a state from the Soviet leadership. They have a layman's idea of what a state and a country are, of what history itself is, plus something a taxi driver would say. For the last 20 years they have been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.

The former have heard in school that there were industrial revolutions and such, and they too want to be great and believe in every such stupid hype about someone being replaced with new great technology, and of course they want to be in charge of that process.

While in actuality, with today's P2P technologies, CEOs are the ones most likely to be replaced, if we use our common sense (though without "AI", of course). Just by decentralized systems allowing much bigger, more powerful, and more competitive cooperatives than before, ones that form and disband very easily.

'Soon' is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update caused worldwide downtime recently. Now imagine all infrastructure written with today's LLMs, which sometimes hallucinate so badly they claim the 'C' in CRC-32C stands for 'Cool'.

I wish we could also add a "Do not hallucinate" prompt to some CEOs.

Meanwhile, LLMs are less useful at helping me write code than IntelliJ was a decade ago.

I'm actually really impressed with the autocomplete IntelliJ is packaged with now. It's really good with golang (probably because golang has a ton of code duplication).

Extremely misleading title. He didn't say programmers would be a thing of the past, he said they'll be doing higher level design and not writing code.

Even so, he's wrong. This is the kind of stupid thing someone without any first hand experience programming would say.

Yeah, there are people who can "in general" imagine how this will happen, but programming is exactly 99% not about "in general" but about specific "dumb" conflicts in the objective reality.

People think that what they can generally imagine of the task is the most important part, and since they don't actually do programming or anything that requires dealing with those small details, they plainly ignore them, because those conversations and opinions exist in a subjective, bendable reality.

But objective reality doesn't bend. Their general ideas without every little bloody detail simply won't work.

Not really; it's doable with ChatGPT right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well, it can generate fairly high-quality solutions.

This is incorrect. And I'm in the industry, in this specific field. Nobody in my industry, in my field, at my level, seriously considers this effective enough to replace their day-to-day coding beyond generating some boilerplate ELT/ETL-type scripts that it is semi-effective at. It still contains multiple errors 9 times out of 10.

I cannot be more clear. The people who are claiming that this is possible are not tenured or effective coders, much less 10x devs in any capacity.

People who think it generates quality enough code to be effective are hobbyists, people who dabble with coding, who understand some rudimentary coding patterns/practices, but are not career devs, or not serious career devs.

If you don't know what you're doing, LLMs can get you close, some of the time. But there's no way it generates anything close to quality enough code for me to use without the effort of rewriting, simplifying, and verifying.

Why would I want to voluntarily spend my day trying to decipher someone else's code? I don't need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else's. This is true by orders of magnitude for AI code-gen today.

So I don't consider anyone who considers LLM code-gen a viable path forward to be a serious person in the engineering field.

It's just a tool like any other. An experienced developer knows that you can't apply every tool to every situation. Just like you should know the difference between threads and coroutines and know when to apply them. Or know which design pattern is relevant to a given situation. It's a tool, and a useful one if you know how to use it.

This is like applying a tambourine made of optical discs as a storage solution. A bit better cause punctured discs are no good.

A full description of what a program does is the program itself, have you heard that? (Except for UB, libraries, ..., but an LLM is no better than a human at those either.)

Right now, not a chance. It's okay-ish at simple scripts. It's alright as an assistant for getting a buggy draft of anything even vaguely complex.

AI doing any actual programming is a long way off.

I heard a lot of programmers say it

Edit: why is everyone downvoting me lol. I'm not agreeing with them, but I've seen and met a lot who do.

They're falling for a hype train then.

I work in the industry. With several thousand of my peers every day that also code. I lead a team of extremely talented, tenured engineers across the company to take on some of the most difficult challenges it can offer us. I've been coding and working in tech for over 25 years.

The people who say this are people who either do not understand how AI (LLMs in this case) work, or do not understand programming, or are easily plied by the hype train.

We're so far off from this existing with the current tech, that it's not worth seriously discussing.

There are scripts, snippets of code, that VS Code's LLM or VS2022's LLM plugin can help with or bring up. But 9 times out of 10 there are multiple bugs in them.

If you're doing anything semi-complex it's a crapshoot if it gets close at all.

It's not bad for generating pseudo-code or templates, but it's designed to generate code that looks right, not code that is right; and there's a huge difference.

AI-generated code is exceedingly buggy, and if you don't understand what it's trying to do, it's impossible to debug, because what it generates is trash-tier code quality.

The tech may get there eventually, but there's no way I trust it, or anyone I work with trusts it, or considers it a serious threat or even resource beyond the novelty.

It's useful for non-engineers to get an idea of what they're trying to do, but it can just as easily send them down a bad path.

People use visual environments to draw systems and then generate code for specific controllers; that's in control systems design and such.

In that sense there are already situations where they don't write code directly.

But this has nothing to do with LLMs.

Just for designing systems in one place visual environments with blocks might be more optimal.

And often you still have actual developers reimplementing this shit because EE majors don't understand dereferencing null pointers is bad

Had to do some bullshit AI training for work. Tried to get the thing to remake cmatrix in Python.

Yeah no, that's not replacing us anytime soon, lmao.

So they would be doing engineering and not programming? To me that sounds like programmers would be a thing of the past.

Sounds like he's just repeating a common meme. I don't see anything about higher level design that would make it more difficult for an AI (hypothetical future AI, not the stuff that's available now) compared to lower level tasks.

We’ll be able to use the newly found time to realise our dream of making PMs redundant by automating them.

How is "not writing code" different from programmers being a thing of the past?

What do you think programmers do?

AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision made based on experience, it falls flat on its face.

It will never understand context and business rules and things of that nature to the same extent that actual devs do.

We are now X+14 months away from AI replacing your job in X months.

Can AI do proper debugging and troubleshooting? That's when I'll start to get worried

It can generate text that looks like a plausible troubleshooting result but is completely detached from reality. Good enough for most CEOs

Well, that would be the 3rd or 4th thing during my career that was supposed to make my job a thing of the past or at least severely reduce the need for it.

(If I remember correctly, OO design was supposed to reduce the need for programmers, as were various languages; then there was outsourcing, visual programming, and on the server side I vaguely remember various frameworks being hailed as reducing the need for programmers because people would just wire modules together with config or some shit like that. Additionally, many libraries and frameworks out there aim to reduce the need for coding.)

All of them, even outsourcing, have made my skills even more in demand. Even when they did reduce the amount of programming needed without actually increasing it elsewhere (a requirement most of them already failed), the market for software responded by expecting software to do more things, in fancier ways, with data from more places, effectively wiping out the coding-time savings and then some.

Granted, junior developers sometimes did suffer because of those things, but anything more complicated than monkey-coder tasks has never been successfully replaced, fully outsourced or the need for it removed, at least not without either the needs popping up somewhere else or the expected feature set of software increasing to take up the slack.

In fact I expect that AI, like outsourcing before it, in a decade or so will have really screwed the market for Senior Software Engineers from the employers' point of view (but created a golden age for employees with those skills) by removing the first rungs of the career path that lead to that level of experience, and this time around they won't even be able to import the guys and gals in India who got to learn the job because the junior positions were outsourced there.

I didn't start my tech career after high school because all the career advice I got was "all the jobs are going to India." Could've had 10 more years' experience, but instead I joined the military. Ugh!

I taught myself Python in part by using ChatGPT. Which is to say, I coaxed it through the process of building my first app, while studying from various resources, and using the process of correcting its many mistakes as a way of guiding my studies. And I was only able to do this because I already had a decent grasp of many of the basics of coding. It was honestly an interesting learning approach; looking at bad code and figuring out why it's bad really helps you to get those little "Aha" moments that make programming fun. But at the end of the day it only serves as a learning tool because it's an engine for generating incompetent results.

ChatGPT, as a tool for creating software, absolutely sucks. It produces garbage code, and when it fails to produce something usable you need a strong understanding of what it's doing to figure out where it went wrong. An experienced Python dev could have built in a day what took me and ChatGPT a couple of weeks. My excuse is that I was learning Python from scratch, and had never used an object oriented language before. It has no excuse.

ChatGPT only gives good answers if you ask the right questions, and to do that you have to be better than a novice. It's great as a rubber ducky that answers back, but its usefulness is a reflection of the user.

I've seen what Amazon produces internally for software, I think the LLMs could probably do a better job.

I'd believe AI will replace human programmers when I can tell it to produce, from a single prompt, the code for an entire video game that stands up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on first play through vast exploration.

In other words, I doubt we'll see human programmers going anywhere any time soon.

Edit:

Reading other replies reminded me how I once, for fun, tried using a jailbroken Copilot to do Python stuff slightly above my already basic coding skill, and it gave me code that tried importing something that absolutely doesn't exist. I don't remember what it was called, since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.

Could you imagine Microsoft replacing windows engineers with a chat gpt prompt? What would that prompt even look like?

No matter the human expense, the green line must go up.

To be honest, this could be an example of where AI could do marginally better. I don't mean that because of code quality or functionality. I mean it in the sense that MS software got absolutely fucked by the internal competition and stack-ranking fostered during the Ballmer years. The APIs are inconsistent, and there is a ton of partially implemented stuff that will never be pushed to completion because everyone who worked on it was fired.

An AI might be able to implement things without intentionally sabotaging itself, but because LLMs are at the forefront of what would be used, and do not have the capability of intention or understanding context, I'm a bit pessimistic.


But you have to describe what it is. If only we had universal languages to do that... Oh yeah, it's code.

Current AI is good at compressing knowledge.

Best job role: information assistant or virtual secretary.

20 years ago at a trade show, a new module-based visual coding tool was introduced in my field which claimed "You'll never need another programmer".

Oddly enough, I still have a job.

The tools have gotten better, but I still write code every day because procedural programming is still the best way to do things.

It is just now reaching the point that we can do some small to medium scale projects with plug and play systems, but only with very specific equipment and configurations.

20 years ago, while I was learning web development, Dreamweaver was supposedly going to eliminate the need for code on websites too. lol

But sadly, the dream of eliminating us seems like it will never die

20 years ago at a trade show, a new module based visual coding tool was introduced in my field which claimed “You’ll never need another programmer”.

It's because people trying to sell silver bullets is nothing new.

The pace of change is about every five years, and some elements are always in transition.

All in one turn key solutions are always one to two cycles behind, so may work great with the stuff I'm already replacing.

I think these are honest attempts to simplify, but by the time they have it sorted, it's obsolete. If I have to build modules anyway to work with new equipment, I might as well just write all the code in my native language.

These also tend to be attempts at all-in-one devices, requiring you to use only devices compatible with those subsystems. I want to be able to use the best tech from whatever manufacturer. New and fancy almost always means a command-line interface, which again means coding.

That guy has never seen AI code before. It regularly gets even simple stuff wrong. Where it's especially bad is when it gives you made-up crap. It tells you a method or function you can use but doesn't tell you where it got it. And then you're like "oh wow, I didn't realize that was available", you try it, realize it's not part of the standard library, and you ask it "where did you get that?" and it's like "oh yeah, sorry about that, I don't know".

My absolute favorite is when I asked copilot to code a UI button and it just pasted "// the UI element should do (...) but instead it is doing (...)" a dozen times.

Like, clearly someone on stackoverflow asked for help, got used for training data, and confused copilot

The first thing AI is gonna replace is the CEO. It's a dumb-ass job; a McDonald's employee requires more expertise.

Until an AI can get clear, reasonable requirements out of a client/stakeholder our jobs are safe.

So never right?

If the assumption is that a PM holds all the keys…

For like, a couple years, sure. Then there will be a huge push to fix all the weird shit generated by AI.

I don't get why the story isn't that AI will help programmers build way better things. If it can actually replace a programmer, I think it's probably just as capable of replacing a CEO. I bet replacing the CEO is the better use case.

You can hire a lot of programmers for the cost of one CEO.

It's a bit nuts, actually, when I think about it. AI could really be useful for replacing CEOs. The more I think about it, the more it makes sense. It's one career that makes sense for it to replace.

Something I've always found funny about "AI will replace programmers soon" is that it means AIs can create AIs, and isn't that basically the end of the economy?

Every office worker is out of a job just like that, and labourers only have as long as it takes to sort out the robot bodies; then everyone is out of a job.

You thought the great recession was bad? You ain't seen nothing!

But I just keep going back to the idea that no invention in history has ever reduced our workload. It has always only shifted the work to the other end of what the invention can produce. I always expect a human will still be needed to glue some process together, and in that little niche, whole industries are created and the rest of us have to learn them.

Let me weigh in with something. The hard part about programming is not the code; it is understanding all the edge cases, making flexible solutions, and so much more.

I have seen many organizations with dozens of really capable programmers who can implement anything. Meanwhile, most management barely knows what they want or what the actual end goal is. Since managers can't deliver perfect products every time even with really skilled programmers, if I subtract the programmers from the equation and substitute a magic box that delivers code to managers whenever they ask for it, the managers won't do much better. The biggest problem is not knowing what to ask for, and even if you DO know what to ask for, they will typically ignore all the fine details.

By the time there is an AI intelligent enough to coordinate a large technical operation, AIs will be capable of replacing attorneys, congressmen, patent examiners, middle managers, etc. It would really take a GENERAL artificial intelligence to be feasible here, and you'd be wildly optimistic to say we are anywhere close to having one of those available on the open market.

I agree with you completely, but he did say there would be no need for 'human programmers', not 'human software engineers'. The skill set you're describing is, I would put forward, one of if not the biggest difference between the two.

This is really splitting hairs, but if you asked that cloud CEO if he employed programmers or 'software engineers' he would almost certainly say the latter. The larger the company, the greater the chance they have what they consider an 'engineering' department. I would guess he employs 0 "programmers" or 'engineeringless programmers'.

Anyone in software engineering will tell you that as you get more senior you spend less time writing lines of code and more time planning, designing, testing, reviewing, and deleting code.

This will continue to be true; it's just that there will be fewer juniors below whose whole job is to produce code that meets a predefined spec or passes an existing test. Instead, a smaller number of juniors will use AI tools to increase their productivity, while still requiring the same amount of direction and oversight. The small amounts of code the seniors write will also get smaller and faster to write, as they too use AI tools to generate boilerplate while filling in the important details.

That is what happens when you mix a fucking CEO with tech: "How many workers can I fire to make more money and boast about my achievements at the annual conference of mega-yacht owners?" Whereas the correct question should obviously always have been (unless you are a psychopath): "How can I use this tech to boost my workers' productivity so they can produce the same amount of work in less time and have more personal time for themselves?"

Also, these idiots always forget the "problem solving" part of most programming tasks, which is still beyond the capability of LLMs. Sure, have LLMs do the mundane stuff so that programmers can spend time on work that is more rewarding. But no, instead let's try to fire everyone.

And anyone who believes that should be fired, because they don't understand the technology at all or what is involved in programming for that matter. At the very least it should make everyone question the company if its leadership doesn't understand their own product.

To predict what jobs AI will replace, you need to know both of the following:

  1. What's special about the human mind that makes people necessary for completing certain tasks
  2. What AI can do to replicate or replace those special features

This guy has an MA in industrial engineering and an MBA, and has been in business his whole career. He has no knowledge of psychology, and only whatever knowledge of AI he's picked up on the side as part of his work.

He's not the guy to ask. And yet, I feel like this is the only kind of guy anyone asks.

The sentiment on AI in the span of 10 years went from "it's inevitable it will replace your job" to "nope, not gonna happen". The difference is that back then, the jobs it was going to replace were not tech jobs. Just saying.

From the very beginning, people were absolutely making connections between AI and tech jobs like programming.

The fuck are you talking about? Are you seriously trying to imply that now that it's threatening tech jobs (it's not), suddenly the narrative around how useful it will be has changed (it hasn't)?

From the very beginning

When exactly do you have in mind? I'm talking about automation; roughly around 2010 the discourse was primarily centered on blue-collar jobs. The discussion was about those careers becoming obsolete if AI ever advanced to the point where it took little to no human involvement to perform the tasks.

Back then, AI with regard to white-collar jobs was nowhere near the primary focus of the discourse, much less programming.

Tech nerds back then were all gung-ho about it making entire careers obsolete in the near future. Truck drivers were supposed to be a dead career by now. They absolutely do not hold the same enthusiasm now that it's being said about their own careers.

Are you seriously trying to imply

You're way off the mark. Save your outrage.

Until you ask it to do something never done before and it has a meltdown.

Yeah nah. We already have copilot and it introduces so many subtle bugs.

I await the day that AI makes CEOs a thing of the past....

When my job was outsourced a few years back, I was thinking there is probably a boatload of Indian graduates coming out of management schools who would do a great job at C-level, for a fraction of the price.

I admit that I work faster with AI help, and if people get more stuff done in less time, there might be fewer billable hours for us in the future. But AI did not replace me; a 10-times-cheaper dude from India did.

Wasn't it the Rabbit R1 scammer who said programmers would be gone in 5 years, like 3 years ago?

I managed to get an AI to build Pong in assembly. AIs are pretty cool things, but not sci-fi level just yet; I didn't just say "build Pong in assembly", I had to hand-hold it a little bit. You need to be a programmer to understand how to guide the AI through the task.

That was something very simple; I doubt you could get it to do more complex tasks without a lot more back and forth.

To give you an example, I had a hard time getting it to understand that the ball needed to bounce off at an angle if intercepted at an angle; it just kept snapping to 90° increments. I couldn't fix it myself because I don't know assembly well enough to get into the weeds with it, so I was stuck until I finally got the AI to do what I wanted. I sort of understood the problem: there was a number somewhere in the system that needed to be made negative, but the AI just kept setting it to a fixed value. A non-programmer wouldn't understand that that was the problem, and so wouldn't be able to explain to the AI how to fix it.
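For the curious, the fix being described is trivial in a higher-level language, which is the point: you have to already know it's a sign flip, not an assignment. A hypothetical Python sketch (the real thing was assembly):

```python
def paddle_bounce(vx: float, vy: float, hit_offset: float) -> tuple[float, float]:
    """Reflect the ball off a paddle.

    hit_offset is where the ball struck the paddle, from -1.0 (one edge)
    to 1.0 (the other). The bug described above: the AI kept assigning
    vx a fixed value instead of negating the value it already had.
    """
    vx = -vx                    # the sign flip: reflect, don't overwrite
    vy = vy + 2.0 * hit_offset  # bounce angle depends on where the ball hit
    return vx, vy
```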

I believe AI is going to become an unimaginably useful tool in the future and we probably don't really yet understand how useful it's going to be. But unless they actually make AGI it isn't going to replace programmers.

If they do make AGI all bets are off it will probably go build a Dyson Sphere or something at that point and we will have no way of understanding what it's doing.

Yeah, I don't see AI replacing any developers working on an existing, moderately complex codebase. It can help speed up some tasks, but it's far from being able to take a requirement and turn it into code that edits the right places and doesn't break everything.

I tried to get it to build a game of checkers and spent an entire day on it; in the end, I could have built the thing myself. Each iteration got slightly worse, and each fix broke more than it corrected.

AI can generate an "almost-checkers" game nearly perfectly every time, but once you start getting into more complex rules like double jumping it just shits the bed.
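Double jumps turn move generation from a single pass into a recursive search, which is exactly where pattern-matching code generation tends to fall over. A toy Python sketch of the recursion involved (hypothetical board representation; kings, move direction, and forced-capture rules omitted):

```python
def jump_chains(pos, board, me, path=None):
    """Yield maximal capture chains starting at `pos` on an 8x8 board.

    `board` maps (row, col) -> "b" or "w", with the moving piece itself
    already removed; captured pieces are removed as the chain is explored.
    """
    path = path or [pos]
    r, c = pos
    extended = False
    for dr, dc in ((1, 1), (1, -1), (-1, 1), (-1, -1)):
        mid = (r + dr, c + dc)            # square being jumped over
        land = (r + 2 * dr, c + 2 * dc)   # square being landed on
        if (board.get(mid, me) != me      # an enemy piece sits in between
                and land not in board     # the landing square is empty
                and 0 <= land[0] < 8 and 0 <= land[1] < 8):
            extended = True
            after = dict(board)
            del after[mid]                # the captured piece leaves the board
            yield from jump_chains(land, after, me, path + [land])
    if not extended and len(path) > 1:
        yield path                        # the chain can't be extended further

# A double jump falls out of the recursion naturally:
# list(jump_chains((2, 2), {(3, 3): "w", (5, 5): "w"}, "b"))
# -> [[(2, 2), (4, 4), (6, 6)]]
```

Single moves are one pass over the board; capture chains need this kind of branching search with board state threaded through it, and "almost right" generated code usually stops at the first jump.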

What these headlines fail to capture is that AI is exceptionally good at bite-sized, pre-defined tasks at scale, and that is the game changer. It's still very far from being capable of building an entire app on its own. That feels more like 5-10 years out.

I haven't tried to scaffold whole projects, but otherwise that lines up with my usage of AI copilots so far.

At this point, they're good at helping you interface with the built in commands and maybe some specific APIs, but it won't do your thinking for you. It just removes the need for some specific base-level knowledge.

Most companies can't even give decent requirements for humans to understand and implement. An AI will just write any old stuff it thinks they want and they won't have any way to really know if it's right etc.

They would have more luck trying to create an AI that takes whimsical ideas and turns them into quantified requirements with acceptance criteria. Once they can do that they may stand a chance of replacing developers, but it's gonna take far more than the simpleton code generators they have at the moment which at best are like bad SO answers you copy and paste then refactor.

This isn't even factoring in automation testers who are programmers, build engineers, devops etc. Can't wait for companies to cry even more about cloud costs when some AI is just lobbing everything into lambdas 😂

Everyone was always joking about how AI should just replace CEOs, but it turns out CEOs are so easily led by the nose that AI companies practically already run the show.

It's a good thing. After all, I don't care when Amazon goes down.

How much longer until cloud CEOs are a thing of the past? Wouldn't an AI sufficiently intelligent to solve technical problems at scale also be able to run a large corporate division? By the time this is actually viable, we are all fucked.

Don't worry guys. As long as project managers think "do the thing ... like the thing ... (waves hands around) ... you know ... (waves hands around some more) ... like the other thing ... but, um, ..., different" constitutes a detailed spec, we're safe.

Lol, as a programmer who uses generative AI myself, I would genuinely love to see them try.

Honestly, I feel like even an AI could write better code than what some big-tech software ships, lol.

The Paramount+ app doesn't even know how to properly hide the pause icon after you hit resume, ffs. It's been months.

The take I see most is that AI is dumb and can't do it yet, so we don't need to worry about this.

To me, it’s not about whether it can or not. If the people in charge think it can, they’ll stop hiring. There is a lot of waste in some big companies so they might not realize it’s not working right away.

Source: I work for a big company that doesn’t do things efficiently.

That big company will go through a crisis, realize its mistake, and quickly start hiring again, or it will fail and disappear.

It's not like jobs will disappear in a single day. Incremental improvements will render lower-level tasks obsolete; they already have to a degree.

Someone will still need to translate the business objectives into logical structure, via code, language, or whatever medium. Whether you call that a "coder" or not, is kind of irrelevant. The nerdy introverts will need to translate sales-douche into computer one way or another. Sales-douches are not going to be building enterprise apps from their techbro-hypespeak.

I worked at a different MAANG company and saw internal slides showing that they planned on being able to replace junior devs with AI by 2025. I don't think it's going according to plan.

At the end of the day, one thing people forget about with these things is that even once you hit a point where an AI is capable of writing a full piece of software, a lot of businesses will still pay money to have a human read through, validate it, and take ownership of it if something goes wrong. A lot of engineering is not just building something for a customer, but taking ownership of it and providing something they can trust.

I don't doubt that eventually AI will do almost all software writing, but the field of software companies isn't about to be replaced by non software people just blindly trusting an AI to do it right (and in legally compliant ways), anytime soon.

I don't think AI will replace my job any time soon when its first thought for going through a 2D matrix was to scan it 500 thousand times, checking each pass with a CPU-intensive process, bringing my PC to a halt until I force-stopped the script.
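To make that failure mode concrete, it was roughly the difference between these two (a hypothetical reconstruction in Python, not the actual generated script):

```python
def find_all_naive(matrix, targets):
    """Roughly what the AI wrote: one full scan of the matrix per target."""
    found = []
    for t in targets:            # 500k targets ...
        for row in matrix:       # ... times every row ...
            for value in row:    # ... times every cell
                if value == t:
                    found.append(value)
    return found

def find_all_single_pass(matrix, targets):
    """One scan total, with O(1) membership checks against a set."""
    wanted = set(targets)
    return [v for row in matrix for v in row if v in wanted]
```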