Tech vendors have also been falling over each other to tell the world how they are including GenAI in their offerings as the leading AI companies attract feverish attention from investors.
Because you can't hype it up for investors if you call it what it actually is: fancy autocomplete. And don't get me wrong, I love me some of the tools out there. But this stuff is being way overhyped.
It's good to go into this stuff with realistic expectations. Will it do all your work? Absolutely not. But what it will do is a lot of the heavy lifting, so you can get more done on the things that require your specific attention.
The whole "sky is falling and we're all going to be enslaved by AI" rhetoric is bullshit meant to sell more stock and inflate a bubble that will absolutely pop.
Exactly. They're tools like any other, and like any other tool they're not going to do the whole job for you; you're going to need to learn how to use them well to get the most out of them.
It's neither magic, like the AI/tech bros would have you think, nor a harbinger of doom and some evil thing that needs to be squashed, like the anti-AI bros want you to think.
Anecdotally, with GenAI I've been able to get 30 billable hours of work done in about 12 this week. I had to break down a detailed 320-page document. The thing is, I'm good enough at my job that I could do that on my own. The difference with AI is that the final product is neater, and I'm not as mentally drained or carpal-tunneled afterwards. Bottom-up automation, for the worker's benefit only, is the only kind I like.
I've seen a junior using ChatGPT to do the job while not really understanding what was going on, and in the end it was a big mess that didn't work. After I told him to read a "for dummies" book and he started to think for himself, he got something decent out of it. It's no replacement for skill and thinking.
Exactly what I expected, and it will only get worse. Those juniors don't know what good or bad code looks like, for example, so they just assume whatever ChatGPT says is correct. They have no benchmark to compare against.
Had a very similar experience in pretty niche use cases. LLMs are great if you understand what you are dealing with, but they are no magical automation tool (at least in somewhat niche, semi-technical use cases where seemingly small errors can have serious consequences).
That's been my experience so far: it's largely useless for knowledge-based stuff.
In programming, you can have it take pseudocode and output working code for more tedious languages, but you have to audit it. Ultimately, I find traditional autocompletion just as useful.
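As a rough illustration of what that audit looks like (a made-up sketch, not from any commenter; the class name, the pseudocode, and the commented concerns are all hypothetical): given pseudocode like "for each line in the file, keep it if it contains the keyword", an LLM might hand back Java along these lines, and the commented spots are exactly the kind of thing you still have to check yourself.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class KeywordGrep {
    // Pseudocode fed to the model:
    //   for each line in file: if line contains keyword, keep it
    static List<String> matchingLines(Path file, String keyword) throws IOException {
        // try-with-resources closes the underlying file handle; whether
        // generated code actually does this is a typical thing to audit
        try (Stream<String> lines = Files.lines(file)) {
            return lines
                    .filter(line -> line.contains(keyword)) // audit: case-sensitive match; was that intended?
                    .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // usage: java KeywordGrep notes.txt TODO
        matchingLines(Path.of(args[0]), args[1]).forEach(System.out::println);
    }
}
```

The code compiles and runs, which is precisely why it needs auditing: the easy-to-miss choices (charset defaults, case sensitivity, resource handling) are where generated code quietly diverges from what you meant.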
I definitely see how it helps people cheat on homework, and how it extends "stock photography" to the point of really limiting the market for photographers or artists producing bland business assets, though.
I see how people find it useful for their "professional" communications, but I hate it, because people who used to be nice and to the point are starting to explode their communications into a big LLM mess.