I predict a huge demand for workers in five years, when they finally realize AI doesn't drive innovation, but recycles old ideas over and over.
I predict execs will never see this despite you being correct. We replaced most of our HR department with enterprise GPT-4, and now almost all HR inquiries where I work are handled through a bot. It hallucinates HR policies and sometimes deletes your PTO days.
But can you convince it to report itself for its violations if you phrase it like it's a person?
No unfortunately. A lot of us fucked with it but it keeps logs of every conversation and flags abusive ones to management. We all got a stern talking to about it afterwards.
"Trust your tools". Not my fault the hammer was replaced by a banana.
I give you permission to replace HR with chatgpt. It just can't be any worse.
"Workforce" doesn't produce innovation, either. It does the labor. AI is great at doing the labor; it excels at mindless, repetitive tasks. AI won't be replacing the innovators, it will be replacing the desk jockeys who do nothing but update spreadsheets or write code.

What I predict we'll see is the floor dropping out of technical schools that teach the things AI will be replacing. We are looking at the last generation of code monkeys. People joke about how bad AI is at writing code, but give it the same length of time as a graduate program and see where it is. Hell, the GPT-3 beta has only been around since June of 2020 (just 13 years after the first iPhone, and look how far smartphones have come).

There won't be a huge demand for workers in 5 years; there will be a huge portion of the population that suddenly won't have a job. It won't be like the agricultural or industrial revolution, where it takes time to make its way around the world, or where there is some demand for artisanal goods. No one wants artisanal spreadsheets, and we are too global now not to outsource our work to the lowest bidder with the highest thread count. It will happen nearly overnight, and if the world's governments aren't prepared, we'll see an unemployment crisis like never before.

We're still in "Fuck around." "Find out" is just around the corner, though.
Even mindless and repetitive tasks require instances of problem solving far beyond what AI is capable of. In order to replace 41% of the workforce you'll need AGI, and we don't know if that's even possible.
Let's also not forget that execs are horrible at estimating work.
"Oh, this'll just be a copy-paste job, right?" No, you idiot, this is a completely different system, and because of xyz we can't just copy everything we did on a different project.
Or salesmen. "Oh, you have another system to integrate with? No, no change in estimates, everything is OK."
Then they conclude the deal, etc., and suddenly that information reaches the people who'll actually be doing it.
It's not replacing people outright; it means each person is capable of doing more work, so we only need 41% of the people to achieve the same task. It will crash the job market. Global productivity and production will improve, then AI will be updated; repeat. It's just a matter of whether we can scale industry to match the total production capacity of people with AI assistance fast enough to keep up. Both of these things are currently exponential, but the lag may cause a huge unemployment crisis in the meantime.
In this potential scenario, instead of axing 41% of people from the workforce, we should all get 41% of our lives back. Productivity and pay stay the same while the benefits go to the people instead of the corporations for a change. I know that's not how it ever works, but we can keep pushing the discussion in that direction.
You and I know damn well that a revolution is the only way that's gonna happen, and there aren't any on the horizon.
What do you replace it with after a revolution? Communism doesn't work, capitalism is flawed, democracy is flawed but seems to at least promote our freedoms. I think we definitely need a liquid democracy before we can start thinking about how we solve the economic problems (well, other than raising the minimum wage, that's a no-brainer) without undermining exponential growth.
Capitalism isn't just flawed, it's broken. For every prosperous nation like the UK or Germany, there's half a dozen Haitis and Panamas.
By "communism", I presume you mean Marxist-Leninist state socialism, which indeed fails miserably. However, it isn't the only alternative to capitalism. Historically, there have been several communes during the Spanish and Russian civil wars that worked fine and didn't have a central leader, let alone a dictatorship. Although they died because of military blunders, this model is currently being followed more or less in Chiapas by the Zapatistas.
In these places, workers' councils ruled. Direct face-to-face democracy among neighbours was how most things were done. I reckon that this is a fairly nice arrangement.
Democracy's flaws come from subversion by the wealthy and the fact that republics don't let people really participate, but rather choose people who participate in their place.
It was 41% of execs saying workforce will be replaced, not 41% of workforce will be replaced
We are walking talking general intelligence so we know it's possible for them to exist, the question is more if we can implement one using existing computational technology.
I've worked with humans who have computer science degrees and 20 years of experience, and some of them have trouble writing good code, debugging issues, communicating properly, and integrating with other teams/components.
I don't see "AI" doing this. At least not these LLM models everyone is calling AI today.
Once we get to Data from Star Trek levels, then I can see it. But this is not that. This is not even close to that.
People are always enthusiastic about automating others' jobs. Just like they are about having opinions on areas of knowledge utterly alien to them.
Say, how most people see the work of medics.
And the fact that revolutions happened a few times in known history makes them confident that another one is just around the corner, and that of course it'll affect others and not them.
Hahahaha, good one
You know what I like about the Pareto principle and all the "divide and conquer" algorithms? You still have to know where the division is and which 10% is more important than the other 90%.
Anyway, my job is learning new stuff quickly and fixing things. Like that of many, many people, even some non-technical types really.
People who can be replaced with machines have already been, for the most part, and where they can't be, it's also a matter of social pressure. Mercantilism, protectionism, and guilds historically defended the interests of certain parties, with force too.
No, I don't think there'll be a sudden "find out" different from any other period of history.
just 13 years after the first iPhone, and look how far smartphones have come
I disagree.
As someone who has the first iPhone, it was amazing and basically did everything that a new one does. It went on all websites, had banking apps and everything.
I would actually argue phones have become worse: they are very bloated and spy on you. At first they actually made your life better, and there were no social media apps super-charged for addiction.
Hype hype hype hype hype.
Hilarious L take
You know what I love about blocking people?
these are the same people who continue to use monetary incentives despite hard scientific evidence that it has the opposite effect from what is desired. they're not gonna realise shit.
The ones refusing to give raises, then shocked and complaining bitterly about loyalty when people quit for a higher wage somewhere else.
Seems to be working in Hollywood films for the last 20 years
Yeah the 59% in this survey are going to end up pretty successful and buy out the 41%
but recycles old ideas over and over.
I am so glad us humans don't do that. It's so nice going to a movie theater and seeing a truly original plot.
The Oncology pharma companies would love that! Every time I google symptoms I swear...
In my experience, 100% of executives don't actually know what their workforce does day-to-day, so it doesn't really surprise me that they think they can lay people off because they started using ChatGPT to write their emails.
This was my immediate thought too. Even people 2-3 levels of management above me struggle to understand our job let alone the person 5-6 levels up in the executive suite.
At my last job my direct manager had to explain to upper management multiple times that X role and Y role could not be combined because it would require someone to physically be in multiple places simultaneously. I think about that a lot when I hear about these corporate plans to automate the workforce.
However, people saying that the C-suite can be replaced with GPTs don't understand that plenty of people outside the C-suite could be replaced (or not) just as well. Lots of office plankton around with such reasoning skills that I just don't know how their work brings profit.
I can't decide whether those people are really needed or they are employed so that they wouldn't collectively lynch those of us who'd keep relevance, but wouldn't be social enough to defend from that doom.
The problem with building hierarchies of humans is with humans politicking and lying and scheming with each other, not even talking about usual stuff like friendship and sympathy and their opposites. It's just impossible to see what's really happening behind all that.
Well it's good to know 59% of execs are aware that AI isn't gonna change shit
Some of that 59% might, but I guarantee at least some very strongly think it will change things; they just think the change it brings will require as many people as before (if not more), doing exponentially more with the people they have.
The problem with that headline is that it doesn't feed the hype cycle.
Could be they just think there is a productivity shortfall and the current workforce plus AI will help meet it. Or they're just lying for PR.
Without more data, it's just guessing though.
Can AI replace executives too?
Yes. And it will.
As soon as we've managed to make a computer that can simulate an entire brain in real time. Who knows how many decades or even centuries that will take.
No. Middle management is a lot of repeating tasks that an AI could do.
The thing is that we're not talking about replacing all middle management; we're talking about giving 10% of the managers the tools to run 90% of the repetitive, tedious and boring tasks.
To replace a corporate executive? No, I don't think so. We already have algorithms more than capable of replacing CEOs. There is nothing that challenging in what they do...
The challenge is to not do whatever the optimal algorithm says. If they simply did what an algorithm says, it would be very easy for competitors to predict.
The challenge comes in being a scapegoat for when things go wrong (albeit a goat with a golden parachute) and a hype man for when things go right.
But as others have said AI won't replace executives because it's executives making the decisions to use AI, and no one with power will ever choose an option that reduces their own money.
Well, the one in power might decide that they're spending too much on the managers below them.
Oh, but the board directors might want to replace the CEO anyway.
You make it sound like corporations invent a new revolutionary wheel each quarter.
They don't.
What fantastic new beverage have Coca Cola launched the last couple of years?
What astonishing new car technology has GM or Volkswagen released lately?
Most companies are doing what they've always done and guarding their market share. Now and then some small competitor with something revolutionary pops up, and it either starts eating market share or gets acquired by one of the bigger ones.
So between a competitor popping up or one of your engineers coming up with a lucky accident, all you do is manage the business as you always have.
It's amazing how this delusion gets repeated so much in here. Absolute unhinged shit.
Yes.
The biggest factor in terms of job satisfaction is your boss.
There's a lot of bad bosses.
AI will be an above average boss before the decade is out.
I really want to see if worker-owned cooperatives plus AI could help democratize running companies (where appropriate). Not just LLMs, but a mix of techniques for different purposes (e.g., hierarchical task networks to help with operations and pipelining, LLMs for assembling/disseminating information to workers).
Say execs. You know, the people who view labor as a cost center.
They say that because that's what they want to happen, not because it's a good idea.
Freeing humans from toil is a good idea, just like the industrial revolution was. We just need our system to adapt and change with this new reality; AGI and universal basic income mean we could live in something like the society in Star Trek.
I'm sure that's what execs are talking about.
Doesn't matter what the execs say, it will happen and it will become easier and easier to start your own business. They are automating themselves out of a high paying job.
And only 41%.
I've advised past clients to avoid reducing headcount and instead be looking at how they can scale up productivity.
It's honestly pretty bizarre to me that so many people think this is going to result in the same amount of work with less people. Maybe in the short term a number of companies will go that way, but not long after they'll be out of business.
Long term, the companies that are going to survive the coming tides of change are going to be the ones that aggressively do more and try to grow and expand what they do as much as possible.
Effective monopolies are going out the window, and the diminishing returns of large corporations are going to be going head to head with a legion of new entrants with orders of magnitude more efficiency and ambition.
This is definitely one of those periods in time where the focus on a quarterly return is going to turn out to be a cyanide pill.
Yup, and there's a lot you can do to increase productivity:
- less time wasted in useless meetings - I've been able to cut ours
- more time off - less burnout means more productivity
- flexible work schedules - life happens, and I'm a lot more willing to put in the extra effort today if I know I can go home early the next day
- automate the boring parts - there are some fantastic applications of AI, so introduce them as tools, not replacements
- profit sharing - if the company does well, don't do layoffs, do bigger bonuses or stock options
- cut exec pay when times get hard - it may not materially help reduce layoffs, but it certainly helps morale to see your leaders suffering with you
And so on. Basically, treat your employees with respect and they'll work hard for you.
Short term is all that matters. Business fails? Start another one, and now you have a bunch of people that you made unemployed creating downward pressure on labor prices.
No, you have a lot of people you made unemployed competing with you.
This is already what's happening in the video game industry. A ton of people have lost their jobs, and VC money has recently come pouring in trying to flip the displaced talent into the next big success.
And they'll probably do it. A number of the larger publishers are really struggling to succeed with titles that are bombing left and right as a result of poor executive oversight on attempted cash grabs to please the short term market.
Look at Ubisoft's 5-year stock price.
Short term is definitely not all that matters, and it's a rude awakening for those that think it's the case.
Mostly the execs don't care. They've extracted "value" in the form of money and got paid; that's the extent of their ability to look forward. The faster they make that happen, the faster they can do it again, probably somewhere else. They don't give a single shit what happens after.
It really depends on the exec.
Like most people, there's a range.
Many are certainly unpleasant. But there's also ones that buck the trend.
Yeah, and there are a few good lawyers and a few good cops and (probably) a few good politicians too, but we're not talking about the few exceptions here.
Well, we kind of are, as the shitty ones tend to fail over time and the good ones continue to succeed. In a market that's much more competitive because of a force multiplier on labor unlike anything the world has seen, there's not going to be much room for the crappy execs for very long.
Bad execs are like mosquitos. They thrive in stagnant waters, but as soon as things get moving they tend to reduce in number.
We've been in a fairly stagnant market since around 2008 for most things with no need for adaptation by large companies.
The large companies that went out of business recently pretty much all failed from financial mismanagement, not product/market fit. Compare Circuit City or Blockbuster, from the last time adaptation was needed, which went out of business because they failed to adapt.
The fatalism on Lemmy is fairly exhausting. The past decade shouldn't be used as a reference point for predicting the next decade. The factors playing into each couldn't be more different.
I just want to say I appreciate your informed opinions in contrast to the doom and gloomerism combined with class warfare that is so pervasive here.
Scaling up productivity is what tends to lead to layoffs. Having the exact same output but with fewer employees is pretty much guaranteed to lower cost and increase profit, so that's what most execs are likely to do. Short-sighted maybe, but businesses are explicitly short-sighted, only focusing on the next quarter.
How do you arrive at "effective monopolies are going out the window"? How do you square it with what we see in the world today, which runs counter to that?
There are diminishing returns on labor for large companies, and an order-of-magnitude labor multiplier is in the process of arriving.
For example, if you watched this past week's Jon Stewart, you saw an opening segment about the threat of AI taking people's jobs and then a great interview with the head of the FTC talking about how they try to go after monopolistic firms. One of the discussion points was that often when they go up against companies that can hire unlimited lawyers they'll be outmatched by 10:1.
So the FTC with 1,200 employees can only do so much, and the companies they go up against can hire up to the point of diminishing returns on more legal resources.
What do you think happens when AI capable of a 10x multiplier in productivity at low cost is available for legal tasks? The large companies are already hiring to the point where there's not much more benefit from more labor. But the FTC is trying to do as much as they can with a tenth of the resources.
Across pretty much every industry companies or regulators a fraction of the size of effective monopolies are going to be able to go toe to toe with the big guys for deskwork over the coming years.
Blue collar bottlenecks and physical infrastructure (like Amazon warehouses and trucks) will remain a moat, but for everything else competition against Goliaths is about to get a major power up.
Can't wait for AI to replace all those useless execs and CEOs. It's not like they even do much anyways, except fondling their stocks. They could probably be automated by a Markov chain.
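For what it's worth, the Markov chain dig is barely an exaggeration; a word-level chain really is just a few lines of Python. A toy sketch (the exec-speak corpus is invented for the joke):

```python
import random

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain from `start`, picking a random successor each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = ("we will leverage synergies to drive shareholder value "
          "we will drive innovation to leverage growth")
chain = build_chain(corpus)
print(generate(chain, "we"))
```

It only ever recombines phrases it has already seen, which is rather the point of the joke.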
If they could replace project managers, that would be nice. In theory it is an important job, but in practice it's just done by someone's mate who was most productive when they didn't actually turn up.
The Paranoia RPG has a very realistic way of determining who gets to be the leader of a group. First, you pick who'll do what kind of job (electronics, brute force, etc). Whoever didn't get picked becomes the leader, as that person is too dumb to do anything useful.
Yes that's quite a funny and satirical way of doing it but it's probably not actually the best way in real life.
I think Boeing has proven this quite nicely for everyone: the company was much better off when they had actual engineers in charge. When corporate paper pushers took over, everything went downhill.
I have been on enough projects where engineers were in charge that went to hell to know that isn't always a solution. And yes, I am an engineer.
One of the projects I am on now, the main lead is a full PE civil, and it's a manmade clusterfuck: well behind schedule, over budget, and several corporate bridges burned. Haven't even started digging yet.
By far the very biggest cluster fuck I was ever on was run by a Chemical Engineer. A 40 million dollar disaster that never should have been even considered.
Being good at technical problems (which frankly most of us aren't) doesn't mean you know how to do anything else.
I have had good ones and not so good ones.
I swear people don't know the difference between a good project manager, a bad one, and none at all.
Everyone on here is on about how the board has no idea what the bottom rungs of the ladder do, all "haha, they are so stupid, they think we do nothing". Then in the next sentence they say they don't know what the board does and that the board just does nothing.
Project managers or board members? What the hell are you on about?
People slagging off jobs they don't understand.
Both: project managers, whom they probably have experience dealing with but don't understand, and board members, whom they probably don't have any experience with and also don't understand.
Board members don't do shit
I see.
What is this judgment based on?
First hand experience
Don't get a job in government contracting. Pretty much I do the work and around 5 people have suggestions. None of whom I can tell to fuck off directly.
Submit the drawing. Get asked to make a change to align with a spec. Point out that we took exception to the spec during bid. Get asked to make the change anyway. Make the change. Get asked to make another change by someone higher up the chain of five. Point out change will add delays and cost. Told to do it anyway. Make the next change....
Meanwhile, every social scientist: "we don't know what is causing cost disease".
if a manager says that instead of seeing the opportunity to reassign staff and expand, the manager needs to be replaced by AI immediately
Just become an AI consultant and triple your salary.
AI will (be a great excuse to) reduce workforce, say 41% of people who get bonuses if they do.
Game's changed. Now we fire people, try to rehire them for less money and if that doesn't work we demand policy changes and less labour protection to counter the "labour shortage".
Labor shortage is such a funny term. It's like coming to a store looking for 1kg of meat for $1, not finding it, and saying there's a meat shortage. Or coming to a vegetarian store looking for 1kg of any meat and saying the same.
When everybody is employed but the economy needs more people, that's a labor shortage. When there are people looking for jobs but not satisfied with the particular offerings, that's something else.
If Gartner comes out with a decent AI model, you could replace over half of your CIOs, CISOs, CTOs, etc. Most of them lack any real leadership qualities and simply parrot what they're told/what they've read. They're there through nepotism.
Also, most of them use AI as a crutch, so that's all they know. Meanwhile, the rest of us use it as a tool (what it's meant to be).
Christ, if you think a CTO is hard to deal with, wait until you have to interface with the AI CTO.
As long as I can prompt-engineer my way into twice the salary for half the hours, that might still be worth it!
Yup. The owners can save a lot of money on those paychecks.
Won't tho.
simply parrot what they're told/what they've read.
That's exactly what an LLM is
But the AI can do it cheaper
But their job is to be the fall guy.
41% of execs think that a huge amount of class power will go from workers in general to AI specialists (and probably the companies they make or that hire them).
I personally can't wait for the people these businesses bet wrong on replacing to turn around and form new competition, with this new tech filling in the gaps of middle management, HR, execs, etc.
I mean, it's a fucking meme, but an AI-assisted workplace democracy seems alright to me on paper (the devil's in the details).
Execs don't give a shit. They simply double down on the false cause fallacy instead. They wouldn't ever admit they fucked up.
Last year the company I work for went through a run of redundancies, claiming AI and system improvements were the cause. Before this point we were growing (slowly) year on year. Just not growing fast enough for the shareholders.
They cut too deep, shit is falling apart, and we're losing bids to competitors. Now they've doubled down on AI, claiming blindness to the systems issues they created, and just made an employee's "Can Do" attitude a performance goal.
Optimising for the oblivious or unscrupulous, nice.
You sound like you work for one of my parts suppliers.
Let's try it. I am willing to start a worker co-op headed by votes and an AI. Fuck it.
59% of execs are wrong.
I think that's a little low.
They'll be replaced with AI
Thankfully
I don't even wanna work. I just wanna live and if that's not possible, exist.
Same. I welcome our AI overlords as long as that means I can just stay at home and fully embrace my autism by not giving a fuck about the workforce while studying all of the thousands of subjects I enjoy learning about.
Not a thing til the revolution, dear.
I say AI overlords might be an improvement over the human overlords that have persisted throughout human history.
The AI overlords will be trained on data based on human overlords decisions and justifications. We are fucked, my man.
They won't be though because the managers don't know anything about AI. People who actually train the AI will be some poor sap in IT who's been lumbered with a job they don't want, because AI is computers right.
So I'm going to train it on good stuff written by professionals, Star Trek episodes, and make it watch War Games.
The managers don't even have any data sets the AI could absorb anyway because most of their BS is in person, and so not recorded for analysis.
Oh my. I see you don't know much about the hell called key performance indicators...
Key performance indicators will be what will turn our AI overlords into AI tyrants. And there is so so much data available for training the AIs.
The autism is not required. No one cares about their jobs, especially people who work in jobs where "everyone is a family". People care about those jobs the least.
I will never care if AI takes mandatory work from me, but I want income replacement lol. Seriously though I hate working so much every job I've ever had has made me suicidal at some point. I'm glad there's a chance at least I won't have nothing but work and death ahead of me. If that's all that's left it's okay, a little disappointing but it is what it is.
Not allowed. Work or die, I'm afraid.
And that means lower prices for consumers. Right? Guys.. r.. right?
And that means lower prices for consumers. Right? Guys.. r.. right?
No, but it does mean 41% fewer people can afford to buy these companies' products, you cheapass shortsighted corporate fucks.
41% is the number of executives that think AI will reduce their work force, not the number of jobs they expect to replace.
Your point stands though.
More businesses will be started to make the products since the profit margin is suddenly so high... driving down prices.
Execs? The same people who make short sighted decisions and don't understand basic psychology? Let me go get a pen so I won't...give two fucks what this bogus survey says. Let AI run your business so I can have some excitement in my life
They don't care. Jack Welch's ghost must be fed by destroying more companies for short-term gain.
As someone scripting a lot for my department in the tech industry, yea AI and scripts have a lot of potential to reduce labor. However, given how chaotic this industry is, there will still need to be humans to take into account the variables that scripts and AI haven't been trained on (or are otherwise hard to predict). I know the managers don't wanna spend their time on these issues, as there's plenty more for them to deal with. When there's true AGI, that may be a different scenario, but time will tell.
Currently, we need to have some people in each department overseeing the automations of their area. This stuff mostly kills the super redundant data entry tasks that make me feel cross eyed by the end of my shift. I don't wanna be the embodiment of vlookup between pdfs and type the same number 4+ times.
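For anyone curious, that "VLOOKUP between PDFs" kind of task is exactly what a few lines of script absorb. A hedged sketch (the file layout and field names are invented, not my employer's actual data): a lookup-and-reconcile is just a dictionary keyed on the shared column.

```python
import csv
import io

# Toy stand-in for two exported files; in practice these would be
# real CSVs (or text extracted from PDFs) opened with open().
invoices = io.StringIO("invoice_id,amount\nA1,100\nA2,250\n")
payments = io.StringIO("invoice_id,paid\nA1,100\nA2,200\n")

# The "VLOOKUP": one pass builds a key -> value map...
amounts = {row["invoice_id"]: row["amount"] for row in csv.DictReader(invoices)}

# ...and one pass reconciles against it, so nobody retypes a number.
for row in csv.DictReader(payments):
    expected = amounts.get(row["invoice_id"], "?")
    status = "OK" if expected == row["paid"] else "MISMATCH"
    print(row["invoice_id"], expected, row["paid"], status)
```

Swap the StringIO stand-ins for real files and the cross-eyed part of the shift disappears.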
exactly, this will eliminate some jobs, but anyone who's asked an LLM to fix code longer than 400 lines knows it often hurts more than it helps.
which is why it is best used as a tool to debug code, or write boilerplate functions.
Do you think AI for programmers will be like CAD was for drafters? It didn't eliminate the position, but allowed fewer people to do more work.
this is pretty much what i think, yeah.
a lot of programming/software design is already kinda that anyway. it's a bunch of people who were educated on computer science principles, data structures, mathematics, and data analytics/stats, who write code to specs to solve very specific tool problems for very specific subsets of workers, and who maintain/update legacy code written decades ago.
now, yeah, a lot of things are coded from scratch, but even then, you're referencing libraries of code written by someone a while ago to solve this problem or serve this purpose or do thing, output thing. that's where LLMs shine, imo.
No. More high-level languages with less abstraction leakage are like CAD for drafters. Not "AI".
I personally would want such tools to be more visual and more like systems, not algorithms.
Like interconnected nodes in a control system. Like PureData for music, or like LabView. Maybe more powerful and general-purpose.
You'll get blindsided real quick. AIs are just getting better. OpenAI is already saying they've moved past GPT for their next models. It's not 5 years before it can fix code longer than 400 lines, and not 20 before it can digest a specification and spout working software. Said software might not be optimized or pretty, but those are things people can work on separately. Where you needed 20 software engineers, you'll need 10, then 5, then 1-2.
You have more in common with the guy getting replaced today than you care to admit in your comment.
Edit: not sure why I'm getting downvoted instead of having a discussion, but good luck to you all in your careers.
i didn't downvote you, regardless internet points don't matter.
you're not wrong, and i largely agree with what you've said, because i didn't actually say a lot of the things your comment assumes.
the most efficient way i can describe what i mean is this:
LLMs (this is NOT AI) can, and will, replace more and more of us. however, there will never, ever be a time where there will be no human overseeing it because we design software for humans (generally), not for machines. this requires integral human knowledge, assumptions, intuition, etc.
LLMs (this is NOT AI)
I disagree. When I was studying AI at college 20+ years ago, we were also talking about expert systems, which are glorified if/else chains. Most experts in the field agree that those systems can also be considered AI (though not ML).
You may be thinking of AGI, or universal AI, which is different. I am a believer in the singularity (that a machine will be as creative and conscious as a human), but that's a matter of opinion.
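To make "glorified if/else chains" concrete, here's a minimal forward-chaining sketch (the rules are invented for illustration, not from any real system): rules are just condition/conclusion pairs applied until nothing new fires.

```python
# Toy expert system: each rule is (condition over known facts, conclusion).
RULES = [
    (lambda f: "fever" in f and "cough" in f, "flu_suspected"),
    (lambda f: "flu_suspected" in f,          "recommend_rest"),
]

def infer(facts):
    """Forward-chain: keep applying rules until no new fact is added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in RULES:
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"fever", "cough"}))
```

That's the whole trick; 1980s systems just had thousands of hand-written rules instead of two.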
I didn't downvote you
I was using "you" more towards the people downvoting me, not you directly. You can see the accounts who downvoted/upvoted, btw.
Edit: and I assumed the implication of your comment was that "people who code are safe", which is a stretch I was answering to. Your comment was ambiguous either way.
jesus christ you should be shoved into a locker
Wow. Thanks for the advice. I guess that's just Lemmy showing me the door. Good luck with your community here.
Try not to let the bot hurt your feelings, it was trained on cunts 'n' assholes
Where you needed 20 software engineers, you'll need 10, then 5, then 1-2.
It's an open secret that this is already the case. I have seen projects that went on for decades and only required the engineering staff they had because corporate bureaucracy and risk aversion makes everyone a fraction as effective as they could be, and, frankly, because a lot of ineffective morons got into software development because of the $$$ they could make.
Unless AI somehow eliminates corporate overhead I don't understand how it'll possibly make commercial development monumentally easier.
Scripting is one thing; an unpredictable plagiarism generator is another.
If you mean ML text recognition, ML classification etc - then yeah, why not.
Here's a thought: let's get rid of 41% of execs instead.
Let's get rid of corporate profits for shareholders, if you actually want to fix the problem. Make it illegal for shareholders to profit more than employees.
People here keep belittling AI. You're all wrong, at least when considering the long run... We can't beat it. We need to outlaw it.
Train it to replace CEOs.
It's Schrödinger's AI. It is both useless and will replace everyone, depending on the agenda the particular person is trying to push.
We need to outlaw it.
Train it to replace CEOs.
Oh, there it goes again.
I know it's getting boring. I am tired of people telling me how ChatGPT and friends are toys that just spit back website data, and in the same comment telling me how they are basically angry gods ready to end the human race.
Fucking make up your mind!
"Smash the looms" is the wrong idea.
"Eat the rich" might have some merit though.
Yeah, donβt smash the looms, seize them. The ability to make labor easier and more efficient is a positive if we donβt allow it to be a means to impoverish the workers
Nah, I disagree on both counts.
We can't beat it. We need to outlaw it.
Is the intent here to preserve jobs even if it's less productive? That's solving the wrong problem. Instead of banning it, we should be adapting to it. If AI is more efficient than people, the jobs people take should change.
I think there's a solid case that if something would devolve into rent-seeking because competition is unproductive, it should be provided as a public service. Do you need a job if all of your basic needs are met by AI? At that point, any work you do would be optional, so people would follow their passions instead of working to make ends meet (see: Star Trek universe).
Think of it like Basic Income, but instead of cash, you'd get services at-cost. I think there's room for non-profits (or maybe the government) to provide these AI-services at-cost.
Outlawing it is a very dangerous aim, because outlawing it completely will enable other countries to out-compete us, and outlawing it completely is right next to "outlaw it for normal people, but allow companies to exploit it for profit" on the dart board of possibilities.
Better path all around is "allow everyone to use AI and establish strong social safety nets and move towards enabling people to work less".
Haven't I been hearing that since the rise of computing and the internet? And it's probably been around even longer. Seems like this sort of stuff only gets going when a lot of workers start putting up a fight.
But hey, maybe 41% jobs lost might be the tipping point. Because people aren't just gonna sit on the sidewalk and starve.
If AI is outlawed, only outlaws will have AI
Y'all are dumbass doomers. Have some fun with AI while you can, you aged peasants. We were always fucked.
There's no denying AI is going to replace or significantly reduce some jobs. But I predict it's going to happen mostly in bullshit jobs like marketing, advertisement, and the kind of journalism that repeats the same news from other reputable newspapers.
AI isn't going to replace the migrants laying bricks in front of me, and it's not going to replace their chief.
It'll reduce the workforce from well-remunerated professionals who perform tasks to a larger number of disposable minimum-wage labourers who clean up botshit.
Pretty sure the entire Republican party and the ruling class they serve just orgasmed at that thought.
Biz leaders optimistic it can reduce living, breathing cost centers... er, valued workers
And aggregate demand needed to buy the shit they produce. But that's not this corpo's problem. Not until most corpos are doing it.
AI will remove 41% of execs, say 100% of people who know what AI is.
What's really interesting this time around is AI will cut middle management and paper pushers. Those are typically very good middle class jobs.
Unlike manufacturing, those people really don't have transferable skills. They can't go become mechanics or plumbers.
AI is going to hurt.
The jobs AI would be best at eliminating are HR and management. Instead, corpos give these shitters the power to eliminate other positions and then they still act like HR and management are the people producing value.
This is the risk and it has happened before.
The AI won't do my job exactly, but managers mostly manage, i.e. deal with organisational overhead. That Excel sheet you've been maintaining for the past decade was never as crucial to the business' success as you made it appear. It was something the higher-ups liked to talk about with pretty charts. An AI can generate other things to talk about from the same data.
I don't agree that those people don't have transferable skills, but I agree that it's going to hurt. Like flattening hierarchies, self-organised teams and outsourcing, previously cushy jobs will be replaced with more stressful ones.
You used to have a secretary to make calls for you and organise your calendar. Now you have Copilot and customers call you directly.
You don't need powerful AI or anything for this to happen. They just stopped hiring secretarial staff when managers learned how to use a computer.
I always ask myself who will buy the products these companies produce if all the workers have been fired. Maybe inflation is just the natural ramp-up to McDonald's charging 5,000 dollars for automated chicken nuggets when there are only billionaires left with money lol.
When it's cheaper to make the products because you don't have to pay anyone, people will look at that manufacturer and think... wow I can start a business like that and make an easy profit? Competition will drive down prices.
I think their point is that when everyone's income is $0/hr price becomes pretty much irrelevant (unless also $0)
If they gain decent market share, they will be bought by one of the two or three companies that owns the entirety of that manufacturing category. If they don't, the incumbents will lower prices until the new thing is out of business. In either case, the prices bounce back, and even increase because of "inflation."
There will always be competition if the profit margin is large.
Economies of scale. It doesn't work like that.
Lowering prices to kill competition is kinda illegal, but due to, again, economies of scale they have to lower them to illegal levels. One customer interaction on average brings more profit and less expense at larger scale.
Economies of scale exactly. An industry that is easy to get into (thanks to AI) that has high profit margins will attract competition, just like always.
An industry that is easy to get into (thanks to AI)
I don't know what you mean by that, it doesn't make anything easier yet and there's no reason to think it will.
It absolutely makes it easier. It can answer your questions and present the same info large corporations used to have sole access to.
Example would be starting an entertainment content provider. You no longer need artists or writers. It can be a one person show instead of a team of 6.
Architecture firm... fewer designers. Fewer lawyers need to be hired.
Advertising firm. Fewer employees for sure. More avenues explored than ever possible before.
It can answer your questions and present the same info large corporations used to have sole access to.
No, it's a glorified googling and plagiarism machine.
You can't trust its answers on what you don't already know, so you can't check them. Which makes it useless for that, because what you already know, you already do.
Architecture firm... fewer designers. Fewer lawyers need to be hired.
Firm "no" for both cases. Replacing expertise is something they don't do.
You no longer need artists or writers.
The quality of that content, though, would be such that people are going to pay for tools to avoid seeing it. Not even talking about litigation for plagiarism (which I'm ideologically against, so).
It can be used to assist existing artists and people of other occupations, surely. Or to be used in classification or text and voice recognition. Or even in data compression.
But it's not a revolutionary tool and it's not going to become one. Same as blockchain: a wasteful frontal assault at imitating something wonderful, which it isn't.
Advertising firm. Fewer employees for sure.
Yes, that's possible. And other such trash.
You just don't know what you are talking about. These are not quirks or bugs to be ironed out, these are inherent traits of such a machine.
There were "machine servants" in Antiquity which would pour wine. And there were analog toys in the 70s and 80s which would recognize simple voice commands and react (you can find lots of wonders to build in journals for kids interested in radioelectronics from that time; also, if the "inventions" from Heinlein's "Door into Summer" seem naive to you when reading it now, I hope it's interesting to know that they actually were very plausible even back then). This is just a very complex version of the same.
You have such a low opinion of a technology that, still in its embryonic stage, has already found its way into a large percentage of corporations that almost never adopt a tech this quickly. Already causing layoffs after so little time. Give it only a few more years.
Edit: not sure what you mean by machine servants. That tech has nothing to do with how AI is affecting the world right now.
You should try reading. I don't care about your opinions, but I could care about technical arguments, if there were any.
There's one missing piece here, and it's startup capital. You don't usually see new chemical manufacturers, for instance, because you need a lot of money to buy everything to start with.
Not only that, but also for paying royalties, and money for legal protection when somebody sues you just to bury you, or for consulting lawyers so that nobody would sue you.
There are lots of problems with running a glamorous official business. A mom-and-pop shop - yeah, that you can do.
Large existing corporations will expand the industries they are involved in and take advantage of the ease of entry with AI. They have plenty of capital and won't sit on their hands when golden opportunities are a few purchases away.
Must be nice to just throw "A" in front of all your half-baked personal plans to instantly justify them.
Surely, this can't and won't backfire... /s
I never had the impression that there were enough people for the amount of work anyway. I don't see jobs going, just shifting. Most developers will be fine, because of never-ending work; AI is just a tool speeding things up. But not by that much, as someone who is good with Google and git is just a bit slower to find the same answers. And AI needs verification too, even if it links you directly to the issue at hand via source URL.
AI will create new issues. Some of the low-level requirement jobs will go, like working in first-level support, but only if you train the AI yourself, else it's too generic. We're not there yet where companies train their own LLMs; some outliers try.
We've got to understand that there's still a human layer, and a lot of people might prefer calling a human, even if the result is worse, simply because we're social beings. This can cost a lot of customers if companies believe they can just shove an AI in front.
No one really knows how good AI will get. As the technology advances, we find more and more hard-to-solve issues, for instance that AI will make things up or give wrong answers, despite knowing the real answer, if you pressure it hard enough.
Also for security reasons you can't add AI everywhere, unless you want to send all secrets directly to Microsoft, Google or Facebook.
My 5 cents.
Missing the point.
AI won't so much replace labor as make it more fungible, and thus exploitable/abusable.
Except where it's used as an excuse to just... not. "Yes, we have customer service; it's just all ChatGPT with no permissions," so nobody can ever return shit that was delivered broken.
After reading this article that got posted on Lemmy a few days ago, I honestly think we're approaching the soft cap for how good LLMs can get. Improving on the current state of the art would require feeding it more data, but that's not really feasible. We've already scraped pretty much the entire internet to get to where we are now, and it's nigh-impossible to manually curate a higher-quality dataset because of the sheer scale of the task involved.
We also can't ask AI to curate its own dataset, because that runs into model collapse issues. Even if we don't have AI explicitly curate its own dataset, it's highly likely going to be a problem in the near future with the tide of AI-generated spam. I have a feeling that companies like Reddit signing licensing deals with AI companies are going to find that they mostly want data from 2022 and earlier, similar to manufacturers looking for low-background steel to make particle detectors.
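As a crude back-of-the-envelope sketch of the ratchet behind model collapse: refitting a Gaussian to a finite sample of its own output shrinks the expected standard deviation a little each generation (the sample standard deviation is biased low by roughly a factor of 1 - 1/(4n)). The numbers below are purely illustrative, not a simulation of any real model:

```python
# Toy sketch of the drift behind "model collapse": each generation refits
# a Gaussian to n samples drawn from the previous generation's model.
# E[sample stddev] is roughly sigma * (1 - 1/(4n)), so the learned spread
# decays a little with every fit-on-your-own-output cycle.
n = 50        # illustrative sample size per generation
sigma = 1.0   # spread of the original human-written data
for generation in range(200):
    sigma *= 1 - 1 / (4 * n)  # expected shrink per refit

print(round(sigma, 3))  # after 200 generations the spread has decayed to ~0.367
```

Real collapse dynamics are messier (sampling noise, discrete tokens, lost distribution tails), but the one-way direction is the point: a model trained on the previous model's output keeps losing a bit of the original data's diversity.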
We also can't just throw more processing power at it because current LLMs are already nearly cost-prohibitive in terms of processing power per query (it's just being masked by VC money subsidizing the cost). Even if cost wasn't an issue, we're also starting to approach hard limits in physics like waste heat in terms of how much faster we can run current technology.
So we already have a pretty good idea what the answer to "how good AI will get" is, and it's "not very." At best, it'll get a little more efficient with AI-specific chips, and some specially-trained models may provide some decent results. But as it stands, pretty much any organization that tries to use AI in any public-facing role (including merely using AI to write code that is exposed to the public) is just asking for bad publicity when the AI inevitably makes a glaringly obvious error. It's marginally better than the old memes about "I trained an AI on X episodes of this show and asked it to make a script," but not by much.
As it stands, I only see two outcomes: 1) OpenAI manages to come up with a breakthrough--something game-changing, like a technique that drastically increases the efficiency of current models so they can be run cheaply, or something entirely new that could feasibly be called AGI, 2) The AI companies hit a brick wall, and the flow of VC money gradually slows down, forcing the companies to raise prices and cut costs, resulting in a product that's even worse-performing and more expensive than what we have today. In the second case, the AI bubble will likely pop, and most people will abandon AI in general--the only people still using it at large will be the ones trying to push disinfo (either in politics or in Google rankings) along with the odd person playing with image generation.
In the meantime, what I'm most worried for are the people working for idiot CEOs who buy into the hype, but most of all I'm worried for artists doing professional graphic design or video production--they're going to have their lunch eaten by Stable Diffusion and Midjourney taking all the bread-and-butter logo design jobs that many artists rely on for their living. But hey, they can always do furry porn instead, I've heard that pays well~
I find the way that you write peculiar, in a good way. I mean no offense, but is English your secondary language?
Yeah, it's my second language. Sorry I wrote it a minute before bed, sometimes sentences become even weirder then. I went back and added some more commas. Haha
This is why not every business is successful I guess
🎵Dumb Dumb Dumb Dumb Dumb🎶
This is the best summary I could come up with:
A survey of senior biz executives reveals that 41 percent expect to have a smaller workforce in five years due to the implementation of AI technologies.
The research from staffing provider and recruitment agency Adecco Group found a "buy mindset" around AI, which "could exacerbate skills scarcity and create a two-speed workforce."
The figure is highest in Germany and France, where 49 percent of respondents say their company will employ fewer people in five years because of AI.
Seventy-eight percent of respondents say GenAI will play a "critical role in providing upskilling and development opportunities."
"While there is no denying that commercial interest in AI has been driven by its ability to reduce headcounts, the disruption will be a positive one: these industries have been suffering from decades-long skills crises, short on talent due to the high barriers to entry.
"Robotic engineers, data governors, drug discovery analysts: these are the jobs of tomorrow that rely on AI," she told us.
The original article contains 438 words, the summary contains 161 words. Saved 63%. I'm a bot and I'm open source!
Imo when you make an industry easier for the managers/CEOs using AI and have fewer workers, it will also be easier for people to create competition in that industry... driving down prices.
And as a result the remaining workforce will have to work more.
Eventually the entire economy will be just one overworked Australian man.
Quiet, mate. Pulling the empty carts is the closest thing we get to sleep.
And he'll still be underpaid.
Oi did I hear my name, mate?
Who do these assholes think will buy their products and services when they put the entire workforce out of work? Do they plan to retreat to their bunkers and live out their days underground while the world burns above?
Can't wait for the 94% unemployed to raid the banks and eat the Bankers.
So, when am I gonna see some of all that UBI?
Hahahahahaha...
It's cheaper if we starve to death.
ITT: bunch of people who have no idea what AI even means
This is kind of like the early days of computers or the internet all over again. LLMs are not what educated people mean when they're talking about AI. ChatGPT is not going to take your jobs; AGI will. Nobody knows when. Might be next year or it might take 2 decades.
Enlighten us, oh knowledgeable one, what is AI?
"41% of execs display anomalous sexual prowess, 9-inch dongs thought to play a role."
My money is on cyberdongs
If it isn't yet, AI will be calling the shots for the actual money owners (those big investment companies like BlackRock). Invest here, invest there, demand more from elsewhere. Said AI will then dictate who should be appointed CEO, director, etc., because it will be asked to name "a human", and little Timmy McMeritocracy, son of a higher-up elsewhere, needs his first job, never mind that putting an AI in his place would be more profitable.
Bye bye middle management!
But seriously, work will always expand to the available workforce. That's why there are so many stupid industries. They always tank during a recession, but other industries will expand to use the excess labor.
I predict a huge demand for workforce in five years, when they finally realize AI doesn't drive innovation, but recycles old ideas over and over.
I predict execs will never see this despite you being correct. We replaced most of our HR department with enterprise GPT-4 and now almost all HR inquiries where I work are handled through a bot. It daydreams HR policies and sometimes deletes your PTO days.
But can you convince it to report itself for its violations if you phrase it like it's a person?
No unfortunately. A lot of us fucked with it but it keeps logs of every conversation and flags abusive ones to management. We all got a stern talking to about it afterwards.
"Trust your tools". Not my fault the hammer was replaced by a banana.
I give you permission to replace HR with chatgpt. It just can't be any worse.
"Workforce" doesn't produce innovation, either. It does the labor. AI is great at doing the labor. It excels at mindless, repetitive tasks. AI won't be replacing the innovators; it will be replacing the desk jockeys who do nothing but update spreadsheets or write code. What I predict we'll see is the floor dropping out of technical schools that teach the things that AI will be replacing. We are looking at the last generation of code monkeys. People joke about how bad AI is at writing code, but give it the same length of time as a graduate program and see where it is. Hell, GPT-3's beta only launched in June of 2020 (just 13 years after the first iPhone, and look how far smartphones have come). There won't be a huge demand for workforce in 5 years; there will be a huge portion of the population that suddenly won't have a job. It won't be like the agricultural or industrial revolution, where it takes time to make its way around the world, or where there is some demand for artisanal goods. No one wants artisanal spreadsheets, and we are too global now to not outsource our work to the lowest bidder with the highest thread count. It will happen nearly overnight, and if the world's governments aren't prepared, we'll see an unemployment crisis like never before. We're still in "Fuck around." "Find out" is just around the corner, though.
Even mindless and repetitive tasks require instances of problem solving far beyond what AI is capable of. In order to replace 41% of the workforce you'll need AGI, and we don't know if that's even possible.
Let's also not forget that execs are horrible at estimating work.
"Oh this'll just be a copy paste job right?" No you idiot this is a completely different system and because of xyz we can't just copy everything we did on a different project.
Or salesmen. "Oh, you have that other system to integrate with? No, no change in estimates, everything is OK."
Then they get the deal concluded etc., and then suddenly that information reaches the people who'll actually be doing it.
It's not replacing people outright; it's making each person capable of doing more work, so we only need 41% of the people to achieve the same tasks. It will crash the job market. Global productivity and production will improve, then the AI will be updated; repeat. It's just a matter of whether we can scale industry to match the total production capacity of people with AI assistance fast enough to keep up. Both of these things are currently exponential, but the lag may cause a huge unemployment crisis in the meantime.
In this potential scenario, instead of axing 41% of people from the workforce, we should all get 41% of our lives back. Productivity and pay stay the same while the benefits go to the people instead of the corporations for a change. I know that's not how it ever works, but we can keep pushing the discussion in that direction.
You and I know damn well that a revolution is the only way that's gonna happen, and there aren't any on the horizon.
What do you replace it with after a revolution? Communism doesn't work, capitalism is flawed, and democracy is flawed but seems to at least promote our freedoms. I think we definitely need a fluid democracy before we can start thinking about how we solve the economic problems (well, other than raising the minimum wage, that's a no-brainer) without undermining exponential growth.
Capitalism isn't just flawed, it's broken. For every prosperous nation like the UK or Germany, there's half a dozen Haitis and Panamas.
By "communism", I presume you mean Marxist-Leninist state socialism, which indeed fails miserably. However, it isn't the only alternative to capitalism. Historically, there have been several communes during the Spanish and Russian civil wars that worked fine and didn't have a central leader, let alone a dictatorship. Although they died because of military blunders, this model is currently being followed more or less in Chiapas by the Zapatistas.
In these places, workers' councils ruled. Direct face-to-face democracy among neighbours was how most things were done. I reckon that this is a fairly nice arrangement.
Democracy's flaws come from subversion by the wealthy and the fact that republics don't let people really participate, but rather choose people who participate in their place.
It was 41% of execs saying workforce will be replaced, not 41% of workforce will be replaced
We are walking, talking general intelligences, so we know it's possible for them to exist; the question is more whether we can implement one using existing computational technology.
I've worked with humans who have computer science degrees and 20 years of experience, and some of them have trouble writing good code and debugging issues, communicating properly, integrating with other teams / components.
I don't see "AI" doing this. At least not these LLM models everyone is calling AI today.
Once we get to Data from Star Trek levels, then I can see it. But this is not that. This is not even close to that.
People are always enthusiastic about automating others' jobs. Just like they are about having opinions on areas of knowledge utterly alien to them.
Say, how most see the work of medics.
And the fact that a few times in known history revolutions happened makes them confident that another one is just around the corner, and that of course it'll affect others and not them.
Hahahaha, good one
You know what I like about Pareto law and all the "divide and conquer" algorithms? You should still know where the division is and which 10% are more important than the other 90%.
Anyway, my job is learning new stuff quickly and fixing that. Like that of many, many people, even some non-technical types really.
People who can be replaced with machines have already been replaced for the most part, and where they can't be, it's also a matter of social pressure. Mercantilism and protectionism and guilds historically defended the interests of certain parties, with force too.
No, I don't think there'll be a sudden "find out" different from any other period of history.
I disagree.
As someone who had the first iPhone, I can say it was amazing and basically did everything that a new one does. It went on all websites, had banking apps and everything.
I would actually argue phones have become worse: they are very bloated and spy on you. At first they actually made your life better, and there were no social media apps supercharged for addiction.
Hype hype hype hype hype.
Hilarious L take
You know what I love about blocking people?
these are the same people who continue to use monetary incentives despite hard scientific evidence that they have the opposite effect from what is desired. they're not gonna realise shit.
The ones refusing to give raises, then being shocked and complaining bitterly about loyalty when people quit for a higher wage somewhere else.
Seems to be working in Hollywood films for the last 20 years
Yeah the 59% in this survey are going to end up pretty successful and buy out the 41%
I am so glad us humans don't do that. It's so nice going to a movie theater and seeing a truly original plot.
The Oncology pharma companies would love that! Every time I google symptoms I swear...
In my experience, 100% of executives don't actually know what their workforce does day-to-day, so it doesn't really surprise me that they think they can lay people off because they started using ChatGPT to write their emails.
This was my immediate thought too. Even people 2-3 levels of management above me struggle to understand our job, let alone the person 5-6 levels up in the executive suite.
At my last job my direct manager had to explain to upper management multiple times that X role and Y role could not be combined because it would require someone to physically be in multiple places simultaneously. I think about that a lot when I hear about these corporate plans to automate the workforce.
However, people saying that the C-suite can be replaced with GPTs don't understand that plenty of people not in the C-suite could be replaced (or not replaced) just as well. Lots of office plankton around with such reasoning skills that I just don't know how their work can bring profit.
I can't decide whether those people are really needed, or whether they are employed so that they wouldn't collectively lynch those of us who'd keep relevance but wouldn't be social enough to defend ourselves from that doom.
The problem with building hierarchies of humans is with humans politicking and lying and scheming with each other, not even talking about usual stuff like friendship and sympathy and their opposites. It's just impossible to see what's really happening behind all that.
Well it's good to know 59% of execs are aware that AI isn't gonna change shit
Some of that 59% might, but I guarantee at least some very strongly think it will change things; they just think the change will require as many people as before (if not more), with those people doing exponentially more.
The problem with that headline is that it doesn't feed the hype cycle.
Could be they just think there is a productivity shortfall and the current workforce plus AI will help meet it. Or they're just lying for PR.
Without more data it's just guessing, though.
Can AI replace executives too?
Yes. And it will.
As soon as we've managed to make a computer that can simulate an entire brain in real time. Who knows how many decades or even centuries that will take.
No. Middle management is a lot of repeated tasks that an AI could do. The thing is that we're not talking about replacing all middle management; we're talking about giving 10% of the managers the tools to run 90% of the repetitive, tedious and boring tasks.
Does it take AI to replace a corporate executive? I don't think so. We already have algorithms more than capable of replacing CEOs. There is nothing that challenging in what they do...
The challenge is to not do whatever the optimal algorithm says. If they simply did what an algorithm says, it would be very easy for competitors to predict.
The challenge comes in being a scapegoat for when things go wrong (albeit a goat with a golden parachute) and a hype man for when things go right.
But as others have said AI won't replace executives because it's executives making the decisions to use AI, and no one with power will ever choose an option that reduces their own money.
Well, the one in power might decide that they're spending too much on the managers below them.
Oh, but the board directors might want to replace the CEO anyway.
You make it sound like corporations invent a new revolutionary wheel each quarter. They don't.
What fantastic new beverage has Coca-Cola launched in the last couple of years? What astonishing new car technology has GM or Volkswagen released lately?
Most companies are doing what they've always done and guarding their market share. Now and then some small competitor with something revolutionary pops up, and it either starts eating market share or gets acquired by one of the bigger ones.
So between competition popping up or one of your engineers coming up with a lucky accident, all you do is manage the business as you always do.
It's amazing how this delusion gets repeated so much in here. Absolute unhinged shit.
Yes.
The biggest factor in terms of job satisfaction is your boss.
There's a lot of bad bosses.
AI will be an above average boss before the decade is out.
You do the math.
Managers are easier to replace than workers
I really want to see if worker-owned cooperatives plus AI could help democratize running companies (where appropriate). Not just LLMs, but a mix of techniques for different purposes (e.g., hierarchical task networks to help with operations and pipelining, LLMs for assembling/disseminating information to workers).
Say execs. You know, the people who view labor as a cost center.
They say that because that's what they want to happen, not because it's a good idea.
Freeing humans from toil is a good idea, just like the industrial revolution was. We just need our system to adapt and change with this new reality; AGI and universal basic income mean we could live in something like the society in Star Trek.
I'm sure that's what execs are talking about.
Doesn't matter what the execs say, it will happen and it will become easier and easier to start your own business. They are automating themselves out of a high paying job.
And only 41%.
I've advised past clients to avoid reducing headcount and instead be looking at how they can scale up productivity.
It's honestly pretty bizarre to me that so many people think this is going to result in the same amount of work with less people. Maybe in the short term a number of companies will go that way, but not long after they'll be out of business.
Long term, the companies that are going to survive the coming tides of change are going to be the ones that aggressively do more and try to grow and expand what they do as much as possible.
Effective monopolies are going out the window, and the diminishing returns of large corporations are going to be going head to head with a legion of new entrants with orders of magnitude more efficiency and ambition.
This is definitely one of those periods in time where the focus on a quarterly return is going to turn out to be a cyanide pill.
Yup, and there's a lot you can do to increase productivity:
And so on. Basically, treat your employees with respect and they'll work hard for you.
Short term is all that matters. Business fails? Start another one, and now you have a bunch of people that you made unemployed creating downward pressure on labor prices.
No, you have a lot of people you made unemployed competing with you.
This is already what's happening in the video game industry. A ton of people have lost their jobs, and VC money has recently come pouring in trying to flip the displaced talent into the next big success.
And they'll probably do it. A number of the larger publishers are really struggling to succeed with titles that are bombing left and right as a result of poor executive oversight on attempted cash grabs to please the short term market.
Look at Ubisoft's 5-year stock price.
Short term is definitely not all that matters, and it's a rude awakening for those that think it's the case.
Mostly the execs don't care. They've extracted "value" in the form of money and got paid; that's the extent of their ability to look forward. The faster they make that happen, the faster they can do it again, probably somewhere else. They don't give a single shit what happens after.
It really depends on the exec.
Like most people, there's a range.
Many are certainly unpleasant. But there's also ones that buck the trend.
Yeah, and there are a few good lawyers and a few good cops and (probably) a few good politicians too, but we're not talking about the few exceptions here.
Well, we kind of are, as the shitty ones tend to fail over time and the good ones continue to succeed. So in a market that's much more competitive because of a force multiplier on labor unlike anything the world has seen, there's not going to be much room for the crappy execs for very long.
Bad execs are like mosquitos. They thrive in stagnant waters, but as soon as things get moving they tend to reduce in number.
We've been in a fairly stagnant market since around 2008 for most things with no need for adaptation by large companies.
The large companies that went out of business recently have pretty much all been from financial mismanagement and not product/market fit like Circuit City or Blockbuster from the last time adaptation was needed with those failing to adapt going out of business.
The fatalism on Lemmy is fairly exhausting. The past decade shouldn't be used as a reference point for predicting the next decade. The factors playing into each couldn't be more different.
I just want to say I appreciate your informed opinions in contrast to the doom and gloomerism combined with class warfare that is so pervasive here.
Scaling up productivity is what tends to lead to layoffs. Having the exact same output but with fewer employees is pretty much guaranteed to lower cost and increase profit, so that's what most execs are likely to do. Short-sighted maybe, but businesses are explicitly short-sighted, only focusing on the next quarter.
How do you arrive at effective monopolies going out the window? How do you square that with what we see in the world today, which runs counter to it?
There are diminishing returns on labor for large companies, and an order-of-magnitude labor multiplier is in the process of arriving.
For example, if you watched this past week's Jon Stewart, you saw an opening segment about the threat of AI taking people's jobs and then a great interview with the head of the FTC talking about how they try to go after monopolistic firms. One of the discussion points was that often when they go up against companies that can hire unlimited lawyers they'll be outmatched by 10:1.
So the FTC with 1,200 employees can only do so much, and the companies they go up against can hire up to the point of diminishing returns on more legal resources.
What do you think happens when AI capable of a 10x multiplier in productivity at low cost is available for legal tasks? The large companies are already hiring to the point there's not much more benefit to more labor. But the FTC is trying to do as much as they can with a tenth the resources.
Across pretty much every industry companies or regulators a fraction of the size of effective monopolies are going to be able to go toe to toe with the big guys for deskwork over the coming years.
Blue collar bottlenecks and physical infrastructure (like Amazon warehouses and trucks) will remain a moat, but for everything else competition against Goliaths is about to get a major power up.
Can't wait for AI to replace all those useless execs and CEOs. It's not like they even do much anyways, except fondling their stocks. They could probably be automated by a Markov chain.
If they could replace project managers, that would be nice. In theory it is an important job, but in practice it's just done by someone's mate who is most productive when they don't actually turn up.
The Paranoia RPG has a very realistic way of determining who gets to be the leader of a group. First, you pick who'll do what kind of job (electronics, brute force, etc). Whoever didn't get picked becomes the leader, as that person is too dumb to do anything useful.
Yes that's quite a funny and satirical way of doing it but it's probably not actually the best way in real life.
I think Boeing have proven this quite nicely for everyone; the company was much better off when they had actual engineers in charge. When they got corporate paper pushers, everything went downhill.
I have been on enough projects where engineers were in charge that went to hell to know that isn't always a solution. And yes, I am an engineer.
On one of the projects I am on now, the main lead is a full PE civil, and it's a manmade clusterfuck: well behind schedule, over budget, and several corporate bridges burned. Haven't even started digging yet.
By far the very biggest cluster fuck I was ever on was run by a Chemical Engineer. A 40 million dollar disaster that never should have been even considered.
Being good at technical problems (which frankly most of us aren't) doesn't mean you know how to do anything else.
I have had good ones and not so good ones.
I swear people don't know the difference between a good project manager and a bad one, or none at all.
Everyone on here is on about how the board has no idea what the bottom rungs of the ladder do, all "haha, they are so stupid, they think we do nothing". Then in the next sentence they say they don't know what the board does and that it just does nothing.
Project managers or board members? What the hell are you on about?
People slagging off jobs they don't understand.
Both project managers that they probably have experience with dealing with but don't understand and board members they probably don't have any experience with and also don't understand.
Board members don't do shit
I see.
What is this judgment based on?
First hand experience
Don't get a job in government contracting. Pretty much I do the work and around 5 people have suggestions. None of whom I can tell to fuck off directly.
Submit the drawing. Get asked to make a change to align with a spec. Point out that we took exception to the spec during bid. Get asked to make the change anyway. Make the change. Get asked to make another change by someone higher up the chain of five. Point out change will add delays and cost. Told to do it anyway. Make the next change....
Meanwhile every social scientist "we don't know what is causing cost disease"
If a manager says that instead of seeing the opportunity to reassign staff and expand, the manager needs to be replaced by AI immediately.
Just become an AI consultant and triple your salary.
AI will (be a great excuse to) reduce workforce, say 41% of people who get bonuses if they do.
Game's changed. Now we fire people, try to rehire them for less money and if that doesn't work we demand policy changes and less labour protection to counter the "labour shortage".
Labor shortage is such a funny term. It's like coming to a store looking for 1 kg of meat for $1, not finding it, and saying there's a meat shortage. Or coming to a vegetarian store looking for 1 kg of any meat and saying the same.
When everybody is employed but the economy needs more people, that's a labor shortage. When there are people looking for jobs but not satisfied with the particular offerings, that's something else.
If Gartner comes out with a decent AI model, you could replace over half of your CIOs, CISOs, CTOs, etc. Most of them lack any real leadership qualities and simply parrot what they're told/what they've read. They're there through nepotism.
Also, most of them use AI as a crutch, so that's all they know. Meanwhile, the rest of us use it as a tool (what it's meant to be).
Christ, if you think a CTO is hard to deal with, wait until you have to interface with the AI CTO.
As long as I can prompt-engineer my way into twice the salary for half the hours, that might still be worth it!
Yup. The owners can save a lot of money on those paychecks.
Won't tho.
That's exactly what an LLM is
But the AI can do it cheaper
But their job is to be the fall guy.
41% of execs think that a huge amount of class power will go from workers in general to AI specialists (and probably the companies they start or that hire them).
I personally can't wait for a lot of the businesses that bet on replacing the wrong people to turn around and face new competition, but with this new tech filling in the gaps of middle management, HR, execs, etc.
I mean, it's a fucking meme, but an AI-assisted workplace democracy seems alright to me on paper (the devil's in the details).
Execs don't give a shit. They simply double down on the false cause fallacy instead. They wouldn't ever admit they fucked up.
Last year the company I work for went through a run of redundancies, claiming AI and system improvements were the cause. Before this point we were growing (slowly) year on year. Just not growing fast enough for the shareholders.
They cut too deep, shit is falling apart, and we're losing bids to competitors. Now they've doubled down on AI, claiming blindness to the systems issues they created, and just made an employee's "can do" attitude a performance goal.
Optimising for the oblivious or unscrupulous, nice.
You sound like you work for one of my part suppliers.
Let's try it. I am willing to start a worker co-op headed by votes and an AI. Fuck it.
59% of execs are wrong.
I think that's a little low.
They'll be replaced with AI
Thankfully I don't even wanna work. I just wanna live and if that's not possible, exist.
Same. I welcome our AI overlords as long as that means I can just stay at home and fully embrace my autism by not giving a fuck about the workforce while studying all of the thousands of subjects I enjoy learning about.
Not a thing til the revolution, dear.
I say AI overlords might be an improvement over the human overlords that have persisted throughout human history.
The AI overlords will be trained on data based on human overlords decisions and justifications. We are fucked, my man.
They won't be though, because the managers don't know anything about AI. The people who actually train the AI will be some poor sap in IT who's been lumbered with a job they don't want, because AI is computers, right?
So I'm going to train it on good stuff written by professionals, Star Trek episodes, and make it watch War Games.
The managers don't even have any data sets the AI could absorb anyway because most of their BS is in person, and so not recorded for analysis.
Oh my. I see you don't know much about the hell called key performance indicators...
Key performance indicators will be what will turn our AI overlords into AI tyrants. And there is so so much data available for training the AIs.
The autism is not required. No one cares about their jobs, especially people who work in jobs where "everyone is a family". People care about those jobs the least.
I will never care if AI takes mandatory work from me, but I want income replacement, lol. Seriously though, I hate working so much that every job I've ever had has made me suicidal at some point. I'm glad there's at least a chance I won't have nothing but work and death ahead of me. If that's all that's left, it's okay; a little disappointing, but it is what it is.
Not allowed. Work or die, I'm afraid.
And that means lower prices for consumers. Right? Guys.. r.. right?
No, but it does mean 41% fewer people can afford to buy these companies' products, you cheapass shortsighted corporate fucks.
41% is the number of executives that think AI will reduce their work force, not the number of jobs they expect to replace.
Your point stands though.
More businesses will be started to make the products since the profit margin is suddenly so high... driving down prices.
Execs? The same people who make short sighted decisions and don't understand basic psychology? Let me go get a pen so I won't...give two fucks what this bogus survey says. Let AI run your business so I can have some excitement in my life
They don't care. Jack Welch's ghost must be fed by destroying more companies for short term gain.
As someone scripting a lot for my department in the tech industry, yea AI and scripts have a lot of potential to reduce labor. However, given how chaotic this industry is, there will still need to be humans to take into account the variables that scripts and AI haven't been trained on (or are otherwise hard to predict). I know the managers don't wanna spend their time on these issues, as there's plenty more for them to deal with. When there's true AGI, that may be a different scenario, but time will tell.
Currently, we need to have some people in each department overseeing the automations of their area. This stuff mostly kills the super redundant data entry tasks that make me feel cross-eyed by the end of my shift. I don't wanna be the embodiment of VLOOKUP between PDFs, typing the same number 4+ times.
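The VLOOKUP-between-exports drudgery described above is exactly the kind of thing a few lines of Python replace. A hedged sketch, with the column names and data invented for illustration:

```python
# Two "exports" that someone would otherwise join by hand.
prices = [{"sku": "A1", "price": "9.99"}, {"sku": "B2", "price": "4.50"}]
orders = [{"order": "1001", "sku": "B2"}, {"order": "1002", "sku": "ZZ"}]

# Build a lookup dict once, then join in O(1) per row
# instead of rescanning the price sheet for every order.
price_by_sku = {row["sku"]: row["price"] for row in prices}
for o in orders:
    o["price"] = price_by_sku.get(o["sku"], "MISSING")

print([o["price"] for o in orders])  # ['4.50', 'MISSING']
```

In practice the rows would come from `csv.DictReader` or a PDF extractor rather than literals, but the join logic is the same.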
exactly, this will eliminate some jobs, but anyone who's asked an LLM to fix code longer than 400 lines knows it often hurts more than it helps.
which is why it is best used as a tool to debug code, or write boilerplate functions.
Do you think AI for programmers will be like CAD was for drafters? It didn't eliminate the position, but allows fewer people to do more work.
this is pretty much what i think, yeah.
a lot of programming/software design is already kinda that anyway. it's a bunch of people who were educated on computer science principles, data structures, mathematics, and data analytics/stats, who write code to specs to solve very specific tool problems for very specific subsets of workers, and who maintain/update legacy code written decades ago.
now, yeah, a lot of things are coded from scratch, but even then, you're referencing libraries of code written by someone a while ago to solve this problem or serve this purpose or do thing, output thing. that's where LLMs shine, imo.
No. More high-level languages with less abstraction leakage are like CAD for drafters. Not "AI".
I personally would want such tools to be more visual and more like systems, not algorithms.
Like interconnected nodes in a control system. Like PureData for music, or like LabView. Maybe more powerful and general-purpose.
You'll get blindsided real quick. AIs are just getting better. OpenAI are already saying they've moved past GPT for their next models. It's not 5 years before it can fix code longer than 400 lines, and not 20 before it can digest a specification and spit out working software. Said software might not be optimized or pretty, but those are things people can work on separately. Where you needed 20 software engineers, you'll need 10, then 5, then 1-2.
You have more in common with the guy getting replaced today than you care to admit in your comment.
Edit: not sure why I'm getting downvoted instead of having a discussion, but good luck to you all in your careers.
i didn't downvote you, regardless internet points don't matter.
you're not wrong, and i largely agree with what you've said, because i didn't actually say a lot of the things your comment assumes.
the most efficient way i can describe what i mean is this:
LLMs (this is NOT AI) can, and will, replace more and more of us. however, there will never, ever be a time where there will be no human overseeing it because we design software for humans (generally), not for machines. this requires integral human knowledge, assumptions, intuition, etc.
I disagree. When I was studying AI at college 20+ years ago, we were also talking about expert systems, which are glorified if/else chains. Most experts in the field agree that those systems can also be considered AI (though not ML).
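To make the "glorified if/else chain" point concrete, here's a minimal rule-based sketch in Python. The rules and facts are made up for illustration; real expert systems add forward/backward chaining on top of this:

```python
# An "expert system" in miniature: an ordered list of
# (condition, conclusion) rules, fired first-match-wins.
RULES = [
    (lambda f: f["temp"] > 38.0 and f["cough"], "suspect flu"),
    (lambda f: f["temp"] > 38.0, "suspect fever of unknown origin"),
    (lambda f: True, "no diagnosis"),  # catch-all default rule
]

def diagnose(facts):
    # Walk the rules in order and return the first conclusion
    # whose condition matches the given facts.
    for condition, conclusion in RULES:
        if condition(facts):
            return conclusion

print(diagnose({"temp": 39.2, "cough": True}))  # suspect flu
```

Strip away the inference-engine terminology and that's the whole mechanism, which is why these systems count as AI without involving any ML.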
You may be thinking of GAI or universal AI, which is different. I am a believer in the singularity (that a machine will be as creative and conscious as a human), but that's a matter of opinion.
I was using "you" more towards the people downvoting me, not you directly. You can see the accounts who downvoted/upvoted, btw.
Edit: and I assumed the implication of your comment was that "people who code are safe", which is a stretch I was answering to. Your comment was ambiguous either way.
jesus christ you should be shoved into a locker
Wow. Thanks for the advice. I guess that's just Lemmy showing me the door. Good luck with your community here.
Try not to let the bot hurt your feelings, it was trained on cunts 'n' assholes
It's an open secret that this is already the case. I have seen projects that went on for decades and only required the engineering staff they had because corporate bureaucracy and risk aversion makes everyone a fraction as effective as they could be, and, frankly, because a lot of ineffective morons got into software development because of the $$$ they could make.
Unless AI somehow eliminates corporate overhead I don't understand how it'll possibly make commercial development monumentally easier.
Scripting is one thing and unpredictable plagiarism generator is another.
If you mean ML text recognition, ML classification etc - then yeah, why not.
Here's a thought: let's get rid of 41% of execs instead.
Let's get rid of corporate profits for shareholders, if you actually want to fix the problem. Make it illegal for shareholders to profit more than employees.
People here keep belittling AI. You're all wrong, at least when considering the long run... We can't beat it. We need to outlaw it.
Train it to replace CEO's.
It's Schrödinger's AI. It is both useless and will replace everyone, depending on the agenda the particular person is trying to push.
Oh, there it goes again.
I know it's getting boring. I am tired of people telling me how ChatGPT and friends are toys that just spit back website data, and in the same comment telling me how they are basically angry gods ready to end the human race.
Fucking make up your mind!
"Smash the looms" is the wrong idea.
"Eat the rich" might have some merit though.
Yeah, don't smash the looms, seize them. The ability to make labor easier and more efficient is a positive if we don't allow it to be a means to impoverish the workers.
Nah, I disagree on both counts.
Is the intent here to preserve jobs even if it's less productive? That's solving the wrong problem. Instead of banning it, we should be adapting to it. If AI is more efficient than people, the jobs people take should change.
I think there's a solid case that if something would devolve into rent-seeking because competition is unproductive, it should be provided as a public service. Do you need a job if all of your basic needs are met by AI? At that point, any work you do would be optional, so people would follow their passions instead of working to make ends meet (see: Star Trek universe).
Think of it like Basic Income, but instead of cash, you'd get services at-cost. I think there's room for non-profits (or maybe the government) to provide these AI-services at-cost.
Outlawing it is a very dangerous aim, because outlawing it completely will enable other countries to out-compete us, and outlawing it completely is right next to "outlaw it for normal people, but allow companies to exploit it for profit" on the dart board of possibilities.
Better path all around is "allow everyone to use AI and establish strong social safety nets and move towards enabling people to work less".
Haven't I been hearing that since the rise of computing and the internet? And it's probably been around even longer. Seems like this sort of stuff only gets going when a lot of workers start putting up a fight.
But hey, maybe 41% jobs lost might be the tipping point. Because people aren't just gonna sit on the sidewalk and starve.
If AI is outlawed, only outlaws will have AI
Y'all are dumbass doomers. Have some fun with AI while you can, you aged peasants. We were always fucked.
There's no denying AI is going to replace or significantly reduce some jobs. But I predict it's going to happen mostly in bullshit jobs like marketing, advertisement, and the kind of journalism that repeats the same news from other reputable newspapers.
AI isn't going to replace the migrants laying bricks in front of me, and it's not going to replace their chief.
It'll reduce the workforce from well-remunerated professionals who perform tasks to a larger number of disposable minimum-wage labourers who clean up botshit.
Pretty sure the entire Republican party and the ruling class they serve just orgasmed at that thought.
And aggregate demand needed to buy the shit they produce. But that's not this corpo's problem. Not until most corpos are doing it.
AI will remove 41% of execs, say 100% of people who know what AI is.
What's really interesting this time around is AI will cut middle management and paper pushers. Those are typically very good middle class jobs.
Unlike manufacturing, those people really don't have transferable skills. They can't go become mechanics or plumbers.
AI is going to hurt.
The jobs AI would be best at eliminating are HR and management. Instead, corpos give these shitters the power to eliminate other positions and then they still act like HR and management are the people producing value.
This is the risk and it has happened before.
The AI won't do my job exactly, but managers mostly manage, i.e. deal with organisational overhead. That Excel you've been maintaining for the past decade was never as crucial to the business' success as you made it appear. It was something the higher-ups liked to talk about with pretty charts. An AI can generate other things to talk about from the same data.
I don't agree that those people don't have transferable skills, but I agree that it's going to hurt. As with flattening hierarchies, self-organised teams and outsourcing, previously cushy jobs will be replaced with more stressful ones.
You used to have a secretary to make calls for you and organise your calendar. Now you have Copilot and customers call you directly.
You don't need powerful AI or anything for this to happen. They just stopped hiring secretarial staff when managers learned how to use a computer.
I always ask myself who will buy the products these companies produce if all the workers have been fired. Maybe inflation is just the natural ramp-up to McDonald's charging 5,000 dollars for automated chicken nuggets when there are only billionaires left with money, lol.
When it's cheaper to make the products because you don't have to pay anyone, people will look at that manufacturer and think... wow I can start a business like that and make an easy profit? Competition will drive down prices.
I think their point is that when everyone's income is $0/hr price becomes pretty much irrelevant (unless also $0)
If they gain decent market share, they will be bought by one of the two or three companies that owns the entirety of that manufacturing category. If they don't, the incumbents will lower prices until the new thing is out of business. In either case, the prices bounce back, and even increase because of "inflation."
There will always be competition if the profit margin is large.
Economies of scale. It doesn't work like that.
Lowering prices to kill competition is kinda illegal, but due to, again, economies of scale, they'd have to lower them to illegal levels. One customer interaction on average brings more profit and less expense at larger scale.
Economies of scale, exactly. An industry that is easy to get into (thanks to AI) and has high profit margins will attract competition, just like always.
I don't know what you mean by that, it doesn't make anything easier yet and there's no reason to think it will.
It absolutely makes it easier. It can answer your questions and present the same info large corporations used to have sole access to.
Example would be starting an entertainment content provider. You no longer need artists or writers. It can be a one person show instead of a team of 6.
Architecture firm... fewer designers. Fewer lawyers need to be hired.
Advertising firm. Fewer employees for sure. More avenues explored than ever possible before.
No, it's a glorified googling and plagiarism machine.
You can't trust its answers on anything you don't already know, so you can't check them. Which makes it useless for that, because what you already know, you can already do.
Firm "no" for both cases. Replacing expertise is something they don't do.
The quality of that content, though, would be such that people are going to pay for tools to avoid seeing it. Not even talking about litigation for plagiarism (which I'm ideologically against, so).
It can be used to assist existing artists and people of other occupations, surely. Or to be used in classification or text and voice recognition. Or even in data compression.
But it's not a revolutionary tool and it's not going to become one. Same as blockchain, a wasteful frontal assault way at imitating something wonderful which it isn't.
Yes, that's possible. And other such trash.
You just don't know what you are talking about. These are not quirks or bugs to be ironed out, these are inherent traits of such a machine.
There were "machine servants" in Antiquity which would pour wine. And there were analog toys in the 70s and 80s which would recognize simple voice commands and react (you can find lots of wonders to build in journals for kids interested in radio electronics from that time; also, if the "inventions" from Heinlein's "The Door into Summer" seem naive to you when reading it now, it's interesting to know that they were actually very plausible even back then). This is just a very complex version of the same.
You have such a low opinion of a technology that, still in its embryonic stage, has already found its way into a large percentage of corporations that almost never adopt a tech this quickly. Already causing layoffs after so little time. Give it only a few more years.
Edit: not sure what you mean by machine servants. That tech has nothing to do with how AI is affecting the world right now.
You should try reading. I don't care about your opinions, but I could care about technical arguments, if there were any.
You cared enough to talk with me over 3 days
There's one missing piece here, and it's startup capital. You don't usually see new chemical manufacturers, for instance, because you need a lot of money to buy everything to start with.
Not only that, also for paying royalties and money for legal protection when somebody sues you just to dig you in, or for consulting lawyers so that nobody would sue you.
There are lots of problems with running a glamorous official business. Mom&pop shop - yeah, you can.
Large existing corporations will expand the industries they are involved in and take advantage of the ease of entry with AI. They have plenty of capital and won't sit on their hands when golden opportunities are a few purchases away.
41% of execs maybe
Must be nice to just throw "A" in front of all your half-baked personal plans to instantly justify them.
Surely, this can't and won't backfire… /s
I never had the impression that there were enough people for the amount of work anyway. I don't see jobs going away, but shifting. Most developers will be fine, because of the never-ending work; AI is just a tool speeding things up. But not by that much, as someone who is good with Google and git is just a bit slower to find the same answers. And AI needs verification too, even if it links you directly to the issue at hand via source URL.
AI will create new issues. Some of the low-level-requirement jobs will go, like working in first-level support, but only if you train the AI yourself, else it's too generic. We're not there yet, where companies train their own LLMs; some outliers try.
We've got to understand that there's still a human layer, and a lot of people might prefer calling a human, even if the result is worse, simply because we're social beings. This can cost a lot of customers if companies believe they can just shove an AI in front.
No one really knows how good AI will get. As the technology advances, we find more and more hard-to-solve issues, for instance that AI will make things up or give wrong answers, despite knowing the real answer, if you pressure it hard enough.
Also, for security reasons you can't add AI everywhere, unless you want to send all your secrets directly to Microsoft, Google or Facebook.
My 5 cents.
Missing the point.
AI won't so much replace labor as make it more fungible, and thus exploitable/abusable.
Except where it's used as an excuse to just... not. "Yes, we have customer service; it's just all ChatGPT with no permissions," so nobody can ever return shit that was delivered broken.
After reading this article that got posted on Lemmy a few days ago, I honestly think we're approaching the soft cap for how good LLMs can get. Improving on the current state of the art would require feeding it more data, but that's not really feasible. We've already scraped pretty much the entire internet to get to where we are now, and it's nigh-impossible to manually curate a higher-quality dataset because of the sheer scale of the task involved.
We also can't ask AI to curate its own dataset, because that runs into model collapse issues. Even if we don't have AI explicitly curate its own dataset, it's highly likely going to be a problem in the near future with the tide of AI-generated spam. I have a feeling that companies like Reddit signing licensing deals with AI companies are going to find that they mostly want data from 2022 and earlier, similar to manufacturers looking for low-background steel to make particle detectors.
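The model-collapse effect is easy to illustrate with a toy simulation (my analogy, not from the article). Treat the "model" as nothing more than a word-frequency table, retrained each generation on a finite sample of its own output; any word that misses one sampling round vanishes forever, so diversity only ratchets down:

```python
import random
from collections import Counter

# Toy Wright-Fisher-style collapse: each generation, draw 10 samples
# from the current distribution and make those the new distribution.
random.seed(0)

vocab = {w: 1 / 10 for w in "abcdefghij"}  # start: 10 equally likely "words"
for _ in range(50):
    words, weights = zip(*vocab.items())
    samples = random.choices(words, weights=weights, k=10)
    vocab = {w: c / 10 for w, c in Counter(samples).items()}

print(len(vocab))  # diversity has collapsed well below the original 10
```

Real model collapse involves much richer distributions, but the mechanism is the same: sampling error compounds across generations and lost modes never come back.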
We also can't just throw more processing power at it because current LLMs are already nearly cost-prohibitive in terms of processing power per query (it's just being masked by VC money subsidizing the cost). Even if cost wasn't an issue, we're also starting to approach hard limits in physics like waste heat in terms of how much faster we can run current technology.
So we already have a pretty good idea what the answer to "how good AI will get" is, and it's "not very." At best, it'll get a little more efficient with AI-specific chips, and some specially-trained models may provide some decent results. But as it stands, pretty much any organization that tries to use AI in any public-facing role (including merely using AI to write code that is exposed to the public) is just asking for bad publicity when the AI inevitably makes a glaringly obvious error. It's marginally better than the old memes about "I trained an AI on X episodes of this show and asked it to make a script," but not by much.
As it stands, I only see two outcomes: 1) OpenAI manages to come up with a breakthrough--something game-changing, like a technique that drastically increases the efficiency of current models so they can be run cheaply, or something entirely new that could feasibly be called AGI; or 2) the AI companies hit a brick wall, and the flow of VC money gradually slows down, forcing the companies to raise prices and cut costs, resulting in a product that's even worse-performing and more expensive than what we have today. In the second case, the AI bubble will likely pop, and most people will abandon AI in general--the only people still using it at large will be the ones trying to push disinfo (either in politics or in Google rankings), along with the odd person playing with image generation.
In the meantime, what I'm most worried about are the people working for idiot CEOs who buy into the hype, but most of all I'm worried about artists doing professional graphic design or video production--they're going to have their lunch eaten by Stable Diffusion and Midjourney taking all the bread-and-butter logo design jobs that many artists rely on for their living. But hey, they can always do furry porn instead, I've heard that pays well~
I find the way that you write peculiar, in a good way. I mean no offense, but is English your second language?
Yeah, it's my second language. Sorry I wrote it a minute before bed, sometimes sentences become even weirder then. I went back and added some more commas. Haha
This is why not every business is successful I guess
🎵 Dumb Dumb Dumb Dumb Dumb 🎶
This is the best summary I could come up with:
A survey of senior biz executives reveals that 41 percent expect to have a smaller workforce in five years due to the implementation of AI technologies.
The research from staffing provider and recruitment agency Adecco Group found a "buy mindset" around AI, which "could exacerbate skills scarcity and create a two-speed workforce."
The figure is highest in Germany and France, where 49 percent of respondents say their company will employ fewer people in five years because of AI.
Seventy-eight percent of respondents say GenAI will play a "critical role in providing upskilling and development opportunities."
"While there is no denying that commercial interest in AI has been driven by its ability to reduce headcounts, the disruption will be a positive one: these industries have been suffering from decades-long skills crises, short on talent due to the high barriers to entry.
"Robotic engineers, data governors, drug discovery analysts: these are the jobs of tomorrow that rely on AI," she told us.
The original article contains 438 words, the summary contains 161 words. Saved 63%. I'm a bot and I'm open source!
Imo, when you make an industry easier for the managers/CEOs using AI and it needs fewer workers, it also becomes easier for people to create competition in that industry... driving down prices.
And as a result the remaining workforce will have to work more.
Eventually the entire economy will be just one overworked Australian man.
Quiet, mate. Pulling the empty carts is the closest thing we get to sleep.
And he'll still be underpaid.
Oi did I hear my name, mate?
Who do these assholes think will buy their products and services when they put the entire workforce out of work? Do they plan to retreat to their bunkers and live out their days underground while the world burns above?
Can't wait for the 94% unemployed to raid the banks and eat the Bankers.
So, when am I gonna see some of all that UBI?
Hahahahahaha...
It's cheaper if we starve to death.
ITT: bunch of people who have no idea what AI even means
This is kind of like the early days of computers or the internet all over again. LLMs are not what educated people mean when they're talking about AI. ChatGPT is not going to take your jobs; AGI will. Nobody knows when, though. Might be next year, or it might take two decades.
Enlighten us, oh knowledgeable one, what is AI?
"41% of execs display anomalous sexual prowess, 9-inch dongs thought to play a role."
My money is on cyberdongs
If it isn't already, AI will be calling the shots for the actual money owners (those big investment companies like BlackRock). Invest here, invest there, demand more from elsewhere. Said AI will then dictate who should be appointed CEO, director, etc., because it will be asked to name "a human," and little Timmy McMeritocracy, son of a higher-up elsewhere, needs his first job, never mind that putting an AI in his place would be more profitable.
Bye bye middle management!
But seriously, work will always expand to fill the available workforce. That's why there are so many stupid industries. They always tank during a recession, but other industries will expand to absorb the excess labor.