Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

L4sBot@lemmy.worldmod to Technology@lemmy.world – 559 points –
businessinsider.com

In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa.


Alexa is so bad though. Who's going to pay for that?

“By the way, you can now pay for the Alexa AI option if you want me to reply in a slightly smarter way, but I will still cut you off with ads and other useless things. To activate Alexa AI, say 'activate'.”

"Welcome to the PiHole, Alexa."

Just made the switch to NextDNS. For $2/month I get a lot of the same features but also on my phone when not on WiFi. Still love my pihole though!

"No"

"I heard 'activate'. Thank you! Your credit card will be charged $129 annually. To cancel, please log on to the website because there's no way we're letting you get out of this mess the same way we got you into it."

To cancel, please log on to the website because there's no way we're letting you get out of this mess the same way we got you into it.

Unless you're in California

*to the same degree of intelligence as you've previously experienced. (Ps if you don't we're making Alexa have a room temp IQ)

Guess we'll find out when they finally pull the trigger

Mine can't ever seem to tell the difference between on and off if there is any sound in my house

Still better than Siri ...

Siri was always shit but somehow managed to devolve even further lately. I never trusted her to do more than turning lights on or off, but now this shit happens:

Me: Siri, turn off the lights in the living room

Siri: OKAY, WHICH ROOM? BATHROOM, BEDROOM, KITCHEN, HALLWAY, LIVING ROOM?

Imagine living in a mansion with this cunt

I use Google to turn on my TV by saying 'turn on TV', easily done. But then when I ask it to adjust the volume, it asks me which TV... I only have one TV online, and it had just turned it on.

But he acknowledged that Alexa will need to drastically improve before that happens.

I get tired of the outrage-headline game.

We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It'll eventually be very affordable.

That's already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there's no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.

Hell, you can even run llama.cpp on Android phones.

This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.

Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.

Never underestimate human ingenuity

When they're horny

And where would one look for these sexy sexy AI models, so I can avoid them, of course...

Huggingface is where the models live. Anything that's uncensored (and preferably based on llama 2) should work.

Some popular suggestions at the moment might be HermesLimaRPL2 7B and MythomaxL2 13B for general roleplay that can easily include nsfw.

There are lots of talented people releasing models everyday tuned to assist with coding, translation, roleplay, general assistance (like chatgpt), writing, all kinds of things, really. Explore and try different models.

General rule: if you don't have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.

Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha

Hey, I replied below to a different post with the same question, check it out.

Oh I see, sorry for the repeat question. Thanks!

lol nothing to be sorry about, I just wanted to make sure you saw it.

GPT4All is a neat way to run an AI chat bot on your local hardware.

Thanks for this, I haven't tried GPT4All.

Oobabooga is also very popular and relatively easy to run, but it's not my first choice, personally.

it does have a very funny name though

In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

You're probably right, but I kinda hope you're wrong.

Why?

Call it paranoia if you want. Mainly I don't have faith in our economic system to deploy the technology in a way that doesn't eviscerate the working class.

Oh, you are 100% justified in that! It's terrifying, actually.

But what I am envisioning is using small, open source models installed on our phones that can answer questions or just keep us company. These would be completely private, controlled by the user only, and require no internet connection. We are already very close to this reality, local AI models can be run on Android phones, but the small AI "brains" that are best for phones are still pretty stupid (for now).

Of course, living in our current Capitalist Hellscape, it's hard not to imagine that going awry to the point where we'll all 'rent' AI from some asshole who spies on everything we do, censors the AI for our own 'protection', or puts ads in there somehow. But I guess I'm a dreamer.

Don’t these models require rather a lot of storage?

Storage is getting cheaper every day and the models are getting smaller with the same amount of data.

I’m just curious - do you know what kind of storage is required?

13B quantized models, generally the most popular for home computers with dedicated gpus, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?

It is relative so, I guess if you're comparing that to an atari 2600 cartridge then, yeah, it's hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
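Those file-size ranges follow from simple arithmetic. A back-of-envelope sketch, assuming a quantized model is roughly parameter count times bits per weight, plus about 10% overhead for metadata and non-quantized layers (the exact overhead varies by format):

```python
# Rough size estimate for a quantized local model:
#   parameters * bits-per-weight / 8 bytes, plus ~10% file overhead.
# The 4-bit figures line up with the "3-6 GB for 7B, 6-10 GB for 13B" ranges.

def model_size_gb(params_billions: float, bits_per_weight: float,
                  overhead: float = 0.10) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total * (1 + overhead) / 1e9

for params in (7, 13):
    for bits in (4, 8):
        print(f"{params}B @ {bits}-bit ≈ {model_size_gb(params, bits):.1f} GB")
```

This is only an estimate; real quantization formats mix bit widths per layer, so actual files land a bit above or below these numbers.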

Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

It is a lot to download. If we’re talking about ordinary consumers. Not unheard of though - some games on Steam are 50GB+

So okay, storage is not prohibitive.

God I wish, I would just love local voice control to turn my lights and such on and off... but noooooooooooo

I have home assistant, but have not heard anything good about rhasspy. Just want to control lights and be able to use it to play music and set timers. That being said I run home assistant right now and can control it with Alexa and Siri but.... I would like local only

I have that with just my phone, using Wiz lights and IFTTT. It's the only home automation I even have because it's the only one I found that doesn't necessarily need a special base station like an Alexa or Google Home.

But you want a local base station, else there’s no local control. You want to use local-only networks like z-wave, zigbee, Thread, Bluetooth, etc, even though they require a base station because that’s what gives you a local-only way of controlling things.

Matter promises a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly

I also wonder what I’ll be able to do with the Thread radio in the iPhone 15 Pro

The base stations are what uses the cloud/AI shit. The setup I have doesn't even require an Internet connection or wifi; it's entirely bluetooth. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?

I don't want a piece of hardware that does nothing but act like a fucking middleman for no good reason.

That is not necessarily true. Some base stations use the internet, yes, but not all. For example, a Philips Hue hub does not require internet access, nor does Lutron Caseta. As the other person posted, Home Assistant is the absolute best (IMO) way to do everything locally without the internet.

Your system, while it might work for you, does not scale well due to the limited range and reliability of Bluetooth. You'd likely be better off to adopt a more robust protocol like Z-wave, or ZigBee and get a hub that you have full control over.


It's the year of the voice for Home Assistant. Given their current trajectory, I'm hopeful they'll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year. Setting timers, shopping list management, music streaming, doorbell/intercom management. If you're on the fence about a Nabu Casa subscription, pull the trigger as it helps them stay independent and not get bought out or destroyed by commercial interests.

Thumbs up for Nabu Casa and Home Assistant!

I haven’t yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I’m on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.

Get something with a little more power. Pis have crept outside the price range where they make sense these days. You can get an Intel N100 system on AliExpress/Amazon pretty cheap now, and I've got mine running Proxmox hosting all kinds of stuff.

I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that can run entirely on the phone. It's got a shocking amount of processing in it.

While you may have points against Apple and how effective Siri may be, with this latest generation of products even the watch has enough processing power to do voice processing on device. No ads. No cloud services.

Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of echos early, then they got a little shitty but I was in, and now I just want them out of my house except for one thing - music. Spotify integration makes for easy multi-room audio in a way that doesn't really work as well on the other platform that I'll consider (Apple/Siri) and basically adds sonos-like functionality for a tiny fraction of the price. The Siri balls and airplay are just not as good, and of course, don't work as well with Spotify.

But alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that even though it's a little less convenient because I'm really goddamned tired of hearing "by the way..."


AI is being touted as the solution to everything these days. It's really not, and we are going to find that out the hard way.

I get what you’re saying, but voice assistants are one of the main places LLMs belong.

Yes, but so much more. An actually useful assistant that could draft emails, set reminders appropriately, create automations, etc. would be worth A LOT of money to me.

I think if there ends up actually being a version of AI that is privacy focused and isn't screwing over creators it'd be so much less controversial. Also, everyone (including me) is really, really fucking sick of hearing about it all of the time in the same way that everyone is/was sick of hearing about the blockchain. As in: "Bro your taco stand needs AI/the blockchain."

You wouldn't need any kind of special training for this. Just the ability to do simple things like make calendar appointments, draft emails/responses, and set reminders based on time/locations/etc. It really doesn't seem very complicated but as far as I know no one has figured out how to do it yet. All the existing "assistants" are so bad that I don't even bother trying to use them anymore. They can't even do something simple like turning on a light with any degree of reliability.

Hey, that's only because Amazon, Google, and Microsoft (et al.) just don't have the Money to Make it good!!

So what about 9.99 a month?

4.99 if you pay up front for a year?

Euh, or how much can you cough up, like for a year or at least for Q4, I'm literally on a bad roll here.

I'm not going to buy into a subscription model for something I've already paid for. This subscription model crap is complete bullshit.

We even tried to do it with heated seats recently. Like install heated seats in your car, but disable them in software. It's crazy that companies think they can get away with this.

I think there's a massive difference between unlocking a feature that's already there and requires no maintenance and a cloud-based service that demands 24/7 uptime and constant developer support, as well as ongoing feature development

While I agree with you, they are 💯 going to get away with it, because your average consumer just doesn't care.

If IBM actually manages to convert COBOL into Java like they're advertising, they'll end up killing their own cash cow

So much still runs on COBOL

Alexa is more like a telemarketer disguised as an assistant. Every interaction is followed by a “by the way...”. It's a shit experience, so I stopped using mine.

Alexa was designed explicitly for that purpose. They lose money on every Echo sold, the whole idea was they would make money selling you stuff. Turns out people would rather use their Echo to check the weather, get recipes, etc. rather than voice shop.

I just can’t see a use case for voice shopping. There are almost zero instances where I want to buy something without having a visual of that thing in front of me at time of purchase.

I could possibly see something like “buy another stick of deodorant”, but even then I want to see if there are deals or some other options and would want to check the price at a minimum.

Seems like yet another MBA idea.

Yeah, it seems the execs who had the idea for Alexa never used Amazon for shopping. It's a shit shopping site full of scammy products. I'd never buy anything from them without checking the prices, reviews, etc.

It's really only good for re-ordering things you've already ordered. It will let you know that it found something in your order history and then you can decide whether you want to order again.

And this makes sense, but I’d still want to check prices to make sure that my $3 deodorant didn’t get discontinued and priced at $30/stick.

Well, you think this way because you've seen what happened to Amazon over the past 10 years. 10 years ago, when they were getting ready to launch the Echo, Amazon was a great retailer that people trusted. Now, after a decade of sellers gaming listings and reviews and of Amazon customer service deteriorating, we've been trained not to trust Amazon's defaults.

Ha, I use mine almost exclusively as a light switch. I don't have to get out of bed to turn off my lights or turn on my fan. I'm sure they're losing a bunch of money on me

It's a great lightswitch. Also, thermostat adjustor (my husband is very particular and changes it about a dozen times a day).

It's also good for setting a timer. But yeah, I'm not buying shit from it.

Setting all my Alexas to UK English got rid of all the marketing "by the ways." I still regret going with the Alexa ecosystem, but at least for now there is a workaround for the most rage-inducing part of it.

By the way, did you know that you can find out more about telemarketing with an audio book from audible on the subject. Would you like to hear a preview of that now?

So they expect that people pay for being spied upon and seriously data mined?

Yes and people will pay for it.

I don't know about that. They never delivered on Smart Home promises and the only truly useful thing my Google AI does is to give me the forecast. Otherwise it's just a wifi speaker.

If they finally integrate Bard, I would actually consider paying for the service.

We have a Google mini. They listen to my six year old request the song Poopy Bum Bum all day. The ads get interesting.

So they get massive amounts of free data for Machine Learning, but want to charge users for supplying it?

It's like charging you for cable and then shoving ads down your throat.

It's like charging you for Prime Video and then shoving ads down your throat.

It's like RAAAAIIAAIIIN on your wedding day

That's often the case. They can have their cake and eat it too. Shareholders would expect nothing less.

I think the data is probably less valuable than people think, especially if the users expect an AI response whenever a data point can be collected from them.

Alexa has a feature where you tell it you're leaving the house and it will listen for smoke detectors or breaking glass, alerting you through your phone if it detects something. Amazon is putting that behind a paywall next year.

Google did that with Nest Aware years ago as well. It's super annoying.

...charge me to use Alexa?

I already avoid it like the plague.

I have my doubts Alexa is using AI, seeing how dumb it is, but well.

From the article:

Amazon has bet big on AI, with the company unveiling a new, AI-powered version of Alexa alongside updated versions of its Echo Frames and Carrera smart glasses last week.

Good luck, I guess? Got the first Google home, at first it was great, I was asking it tons of questions. Then the questions stopped, used it for turning on the lights and other automations. Then I installed Home Assistant and the only command Google Home got was to set a timer to know when to pull things out of the oven. Eventually I stopped doing that.

At the moment all Google/Nest Homes have their mic cut off, I only use them to stream music in my house from my NAS via Plex. So yeah..

I still use mine for voice commands with home assistant. Works great.

All mine do is turn lights on and off. Very occasionally they might be used to find a phone or set a reminder, but I wouldn't miss it if that went.

I wondered if I was unusual in not using the voice features much, but according to this thread it seems I'm not.

Something tells me that they'll still listen to you for free.

How to make me go back to buying shit in person, by Amazon.com

They're using the public to train AI, then charging the public for the AI it trained.

Yes?

I'll happily join in bashing Amazon for plenty of reasons, but training an AI is a step that adds significant value to the material. Just like any other product that is an effort that people are willing to pay for.

1 more...

I use Alexa as a way to use an old speaker system. I wouldn't pay to use any "smart" speaker systems. They are pretty dumb and I've already paid once

They thought people would be like "Alexa, buy me a ton of shit on Amazon," but people just use it for timers and the weather.

Yeah, can you imagine trusting that thing to buy stuff for you? Bit scary, actually.

They had those "re-order product" physical buttons for a while which you were supposed to glue to your washing machine so you could reorder when your detergent ran out.

Besides legal issues (at least over here all they could do is put things in your shopping cart) apparently the primary customers of those buttons were hardware hackers, turning them into all kinds of stuff.

Ok. I'll be the weirdo. If it's actually useful, I would pay for it.

Not if it's just the parlor trick that it currently is.

This is the killer for all this shit right now as far as I'm concerned. All of it lives squarely in "huh...neat" territory. I have yet to see anything I felt was truly necessary. Until that happens, paying is a non starter for me.

This is why I'm so confused by Amazon's approach. I know they've already sunk millions if not billions of dollars into this, so why has the user experience not improved in the last 8 years?

I'm not going to buy things with my voice when just getting the lights to turn off or music to play can be an infuriating endeavor. Speech recognition has stagnated.

The third party integrations are just so clunky too. They could have made money by selling licenses to businesses in order to access the service, but again, they haven't improved that experience at all.

The "Alexa, let me talk to Domino's" or "Alexa, ask LG to turn off the TV" is just stupidly cumbersome. Why can't you set up preferred providers? I don't have to say "ask Spotify to play music," I just say "play music," so we know it's possible. It would be trivial to implement other preferred service providers compared to the overall scale of Alexa.

I don't know if you're in IT at all, but the really crazy thing is that as half baked as Alexa stuff feels...a ton of AWS's offerings feel the exact same way. Their marketing material is great, and I do believe their engineers are passionate and have the right intentions. But none of it feels "finished". It all feels like an elaborate beta test. Things don't work, documentation is out of date or just plain wrong, it's impossible to get actual expert support from Amazon directly.

AWS is their biggest money maker and even that is a cobbled together, confusing pile half the time. Sometimes feels like everything is a house of cards.

It's weird to me that a company of this size is just that inept. It's like once they have enough momentum, nothing can stop them.

The same goes for Google, and to some degree MS.

It's really true. I'm actually annoyed that MS is starting to feel this way, particularly with some Azure related services. MS was always the one you could count on to at least be stable, well tested internally, and predictable. At least in comparison to Google and Amazon. But it feels like they have been leaving some of that behind with their cloud stuff as CI/CD becomes more prevalent.

Oh no!

I'll just have to install a weather app and use the timer on my stove instead of using Alexa.

Exactly. I never did find another use for that thing. Had one 2014-2022. It didn’t survive my last move. Was voted off the island.

We upgraded our technologies so much that it's becoming unsustainable for us, so we are increasing prices.

They're never going to be satisfied with the amount of money they have.

That’s capitalism. Endless growth. Grew 1000%? Grow more. More. More. More.

In a real free market, all the banks that destroyed the economy through fraud wouldn't have gotten bailouts, they would've had to "pull themselves up by their bootstraps" like everyone else had to.

In a real free market, the banks would have gotten too big to fail and we would have bailed them out (ask me how I know this)

All I want Alexa to do is turn my lights on and off, set timers, and show me my own pictures. And it can BARELY do that without fucking it up. Everyone I know wants the same; they expect nothing more from it. The "AI" features of Alexa aren't needed or wanted by anyone I've talked to about it.

My Home Assistant Voice is getting really close to displacing Alexa.

Same. I’ve already got an entire setup between GPT with customizable system-level prompting capabilities, and it uses custom voice models I’ve trained over at ElevenLabs.

Now I just gotta slap my lil monsters phat ass into a raspberry pi and then destroy the fuck out of my Alexa devices and ship em to Jeff bozo

Can you share details? Been thinking of doing this with a new PC build. Curious what your performance and specs are.

You shouldn’t need anything really, all the components run via cloud services so you just need a network connection.

That’s why it’ll run just fine on a cheap pi model

Essentially, the script in Python just sends API requests directly to OpenAI and returns the AI response. Then I pass that response to the ElevenLabs API and play that audio binary stream via any library that supports audio playback.

(That last bit is what I’ll have to toy around with on a pi but, I’m not worried about finding a suitable option, there’s lots of libraries out there)
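A minimal sketch of that kind of pipeline, stdlib only. The endpoint paths and payload fields follow the public OpenAI and ElevenLabs HTTP APIs and may drift over time; the model name, voice ID, and keys are placeholders, not anything from the poster's actual setup:

```python
# Sketch: prompt -> OpenAI chat completion -> ElevenLabs text-to-speech
# -> raw audio bytes. Only request *builders* run here; urlopen() does
# the actual network round trips and needs real API keys.
import json
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"
ELEVENLABS_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"

def build_chat_request(prompt, api_key, model="gpt-4o-mini"):
    """Assemble the chat-completion POST; the JSON response carries the
    reply text at choices[0].message.content."""
    body = json.dumps({"model": model,
                       "messages": [{"role": "user", "content": prompt}]})
    return urllib.request.Request(
        OPENAI_URL, data=body.encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"})

def build_tts_request(text, api_key, voice_id):
    """Assemble the text-to-speech POST; the response body is raw audio."""
    return urllib.request.Request(
        ELEVENLABS_URL.format(voice_id=voice_id),
        data=json.dumps({"text": text}).encode(),
        headers={"xi-api-key": api_key, "Content-Type": "application/json"})

# With real keys, the full loop is roughly:
#   reply = json.load(urllib.request.urlopen(build_chat_request(q, key)))
#   text  = reply["choices"][0]["message"]["content"]
#   audio = urllib.request.urlopen(build_tts_request(text, xi_key, vid)).read()
# ...then hand `audio` to whatever playback library the device supports.
```

On a Pi, the playback step is the only hardware-dependent part; the two HTTP calls are identical on any machine with a network connection.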

Oh wait, I think I misunderstood. I thought you had local language models running on your computer. I have seen that be discussed before with varying results.

Last time I tried running my own model was in the early days of the Llama release and ran it on an RTX 3060. The speed of delivery was much slower than OpenAI's API and the material was way off.

It doesn't have to be perfect, but I'd like to do my own API calls from a remote device phoning home instead of OpenAI's servers. Using my own documents as a reference would be a plus too, just to keep my info private and still accessible by the LLM.

Didn't know about Elevenlabs. Checking them out soon.

Edit because writing is hard.

That could be fun! I’ve made and trained my own models prior but I find that getting the right amount of data (in terms of both size and diversity to ensure features are orthogonal out of the gate) can be pretty tough.

If you don’t get that right balance of size and diversity in your data, that efficacy upper limit is gonna be way lower than you’d like, but you might have some good data sets laying around I got no clue ^_^

Lemmy know how it goes!

Has Amazon considered making a stationary AI?

You just typed that question on one. See GPT4All: you can download many models and run them locally. They were about 5-16GB in size the last time I downloaded one. Pretty slow if you don't have a hefty GPU, but it works!

They won't be charging me, because I don't buy shit from Amazon, and don't use their spy platform.

Well so far my Google Smart speakers have two functions:

  • Voice activated timer
  • Wi-Fi speaker that I only ever cast to with my phone, never actually talk to them.

Don't think I'll miss that if they decide to charge for it.

My Google homes have gotten progressively worse over the years. Half the time it will say it's setting a timer but nope, no timer. Recently I'll tell it to play music and it will reply that I don't have any devices with that feature.... they're all Google homes or Chromecast which absolutely play music. Really like the hardware but the software is utter shit

I hate when it's playing music and I tell it to shut the fuck up, then it decides to turn off every actively going alarm in the house instead of turning off the goddamn music playing on the one it literally responded in. This happens most mornings.

They also removed the ability to link third-party list applications so now saying "Add X to shopping list" just sends it into some nether realm where the item is never to be seen ever again.

Running AI may be currently expensive, but the hardware will continue to improve and get cheaper. If they institute a subscription fee and people actually pay for it, they'll never remove that fee even after it becomes super cheap to run.

That is sort of the issue when mixing good conscience with capitalism. Either goods are valued at what we're willing to pay, or they're valued at what we think the profit margin of the business should be, but mixing the two ultimately leads us to fall for PR crap. Businesses are quick to gather sympathy when the margins are low, and we fall for it, but as soon as they own a part of the market it turns into raising the price as much as they possibly can.

That being said, Amazon became what it is because Bezos was hell bent on not rug pulling customers, at least in the early years, so it is possible they would decrease prices eventually to gain market advantage, that's their whole strategy.

but the hardware will continue to improve and get cheaper.

Eh. I mean, sure, the likes of A100s will invariably get cheaper because they're overpriced AF, but there isn't really that much engineering going into those things hardware-wise: accelerating massive chains of FMAs is a much smaller challenge than designing a CPU or GPU. Meanwhile, Moore's law is -- well, maybe not dead, but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn't been true for a while now, and the physics of everything aren't exactly getting easier: they're now battling quantum uncertainty in the lithography process itself.

Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it's not like digital systems can't make use of that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it's an arcane art.
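The "reducing precision" point can be made concrete with a toy example: squash a vector of float "weights" into 8-bit integers and measure the round-trip error. This is a simplified symmetric quantization sketch, not any particular accelerator's scheme:

```python
# Symmetric int8 quantization of a weight vector: each value is stored as
# round(w / scale), so the round-trip error is bounded by half a scale step.
# That small, *controlled* error is the digital analogue of tolerating noise.
import random

def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127
    q = [round(w / scale) for w in weights]  # each q value fits in [-127, 127]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

random.seed(0)
w = [random.gauss(0, 1) for _ in range(1000)]
q, s = quantize_int8(w)
restored = dequantize(q, s)
max_err = max(abs(a - b) for a, b in zip(w, restored))
print(f"max abs error: {max_err:.4f} (one scale step = {s:.4f})")
```

The worst-case error is half a scale step, and a network trained with a bit of noise tolerance usually shrugs that off, which is why 8-bit (and even 4-bit) inference works as well as it does.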


tl;dr: Don't expect large leaps, especially not multiple. This isn't a naughts "buy a PC twice as fast at half the price two years later" kind of situation; AI accelerators are silicon like any other, and they already make use of the progress we made back then.

As someone at a company that's still using free AI credits in its commercial products and hasn't figured out how to price the shit when the credits are up... this AI market looks a lot like Uber subsidies.

Like a year or two from now, probably any AI stuff that isn't self hosted is going to be 100% inaccessible to normal people due to cost. It's just a question of how hard they're going to fight to keep current free to download LLM models off the internet once this happens.

That's a pretty accurate encompassing statement. Well done.

We're seeing this all over the tech and tech adjacent space. Can't grow forever at a loss, especially not with increased interest rates and a potential economic downturn.

My guess: if you want decent services, we're going to end up needing to pick a few (or a suite of the basics) to pay for on a monthly basis and cut out all the "free" stuff that is/will get enshittified.

In my eyes they put themselves in an awkward position by garnering a reputation of always collecting more user data than justified, and at this point I assume they do the same with paid products, as it's an industry norm. However, I'm not okay with it and will never pay when the product doesn't respect privacy. The saying used to be "if you don't pay, you're the product", but it is increasingly shifting to: you're the product, and also you have to pay so that our shareholders can experience more infinite growth.

I don't understand this. Hasn't Intel or Nvidia (or someone else) been making claims about their next CPUs having AI functionality built-in?

Having "AI functionality" doesn't mean they can just get rid of their big/expensive models they use now.

If they are anything like Open AI's LLM, it requires very beefy machines with a ton of expensive RAM.

Well that's exactly what I was thinking when these companies were making these claims... like HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? Didn't make sense.

EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.

"AI" doesn't use databases per se, they are trained models built from large amounts of training data.

Some models run fine on small devices (like the model running on phones to make better pictures) but others are huge like Open AI's LLM.

training data.

Wouldn't that data be stored in some kind of database?

No, the data will influence the model.

Some of the data may be found in the model itself (i.e. the AI generated images outputting mangled author signatures from the original works that were used during training) but not in the traditional form of a database. You can't directly retrieve that data back in its original form even if some models can be coerced to do something similar.

It's basically a statistical model built from training data.
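A toy illustration of that point (this is not how LLMs work internally, just the simplest possible "statistical model from training data"): a bigram model keeps counts of which word follows which, not the training text itself.

```python
from collections import Counter

# The "model" is just co-occurrence statistics derived from the text.
training_text = "the cat sat on the mat the cat ran"
words = training_text.split()
model = Counter(zip(words, words[1:]))  # counts of (word, next_word) pairs

# You can query the statistics, but the original document is not
# stored and cannot be directly retrieved the way a database row can.
print(model[("the", "cat")])  # 2
print(model[("cat", "sat")])  # 1
```

Real models replace counts with billions of learned weights, but the same idea holds: the data shapes the model, it isn't filed away inside it.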

The training of these huge models also costs a fortune, allegedly in the millions of dollars, in a data- and processing-intensive process.

The training data isn't stored in the model. You can take an existing model and fine tune it on a whole bunch of additional data and the model size won't change.
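A minimal sketch of why that is (a hypothetical toy linear model, not any real fine-tuning API): fine-tuning adjusts the values of a fixed set of parameters, so the parameter count, and hence the model size, stays the same no matter how much extra data you train on.

```python
# Toy "fine-tuning": one gradient step on a linear model y = sum(wi * xi).
weights = [0.5, -0.2, 0.1]  # a fixed set of 3 parameters

def fine_tune_step(w, x, target, lr=0.1):
    """Nudge the weights toward the target; returns updated weights."""
    pred = sum(wi * xi for wi, xi in zip(w, x))
    err = pred - target
    return [wi - lr * err * xi for wi, xi in zip(w, x)]

before = len(weights)
weights = fine_tune_step(weights, x=[1.0, 2.0, 3.0], target=1.0)
print(len(weights) == before)  # True: values changed, count did not
```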

The other answers are a bit confusing...

Yes, that's in a database.

However, you can think of it like a large library of books on how to best tune a ukulele. It might take a lot of information and skill to figure out how to tune the ukulele, but the ukulele itself, once tuned, is quite small and portable.

You're right. Run an LLM locally adjacent to your application sandboxes and local user apps, and your office will lower its heating bills.

You can record and edit videos on your own devices, but that doesn't mean it's suddenly free for Netflix or YouTube to stream their videos to you.

Surely a local version of Alexa could be developed, but that development would come with its own costs.
Some things simply can't be done locally, such as a web search. Often your route calculations for a map application are also done in the cloud.

Nah, once ML inference and training chips are purpose-built, it'll be built into devices. AI models are the mainframes of today.

At this point, with so many tech giants introducing ads to their services and increasing subscription prices, I think we can expect some kind of subscription fee to access assistants with AI/LLM capability. It would make sense to offer a 'basic' version of these services for free, since people have already invested in the hardware, but I wouldn't be surprised if these companies suddenly block us from using the smart functionality unless we pay.

This is the best summary I could come up with:


The emerging generation of "superhuman" AI models are so expensive to run that Amazon might charge you to use its Alexa assistant one day.

In an interview with Bloomberg, outgoing Amazon executive Dave Limp said that he "absolutely" believes that Amazon could start charging a subscription fee for Alexa, and pointed to the cost of training and running generative artificial intelligence models for the smart speaker's new AI features as the reason why.

Limp said that the company had not discussed what price it would charge for the subscription, adding that "the Alexa you know and love today is going to remain free" but that a future subscription-based version is "not years away."

Generative AI models require huge amounts of computing power, with analysts estimating that OpenAI's ChatGPT costs $700,000 a day or more to run.

Limp, Amazon's senior VP of devices and services, announced he would step down from his role at the company after 13 years a month before the launch of the new products.

Insider's Ashley Stewart reported that former Microsoft exec Panos Panay is expected to replace Limp.


The original article contains 298 words, the summary contains 182 words. Saved 39%. I'm a bot and I'm open source!

Let's see if they will make real money with their AI, now.

I'm actually surprised companies haven't tried to charge for voice assistants already, considering pretty much everything you say to them gets sent to some service somewhere.

Yeah... The moment they do that is the moment I turn off and disconnect every smart speaker I own, take them to the electronics recycling place, and start building out an open-source smart home setup.

I use commercial smart speakers because they're easy and cheap. The moment they stop being one or the other is the moment they stop being in my home.

and start building out an open-source smart home setup.

Why are you waiting? Home Assistant is here now and it works pretty damn well for running a smart home.

At this point, all Alexa does is act as a voice enabler for my HA setup, and even that will be going away soon. They're working hard on a localized voice assistant of their own. It's actually usable NOW; it just doesn't have wake-word support yet, so you have to PTT (Push To Talk) on something for HA to know you want to talk to it.

I shut mine off a while back when I was sure they were advertising stuff based on things we were talking about in the same room. We were discussing moving the chairs out of the office and the next time we went to play music she wanted to sell us new ones.

It might be a total coincidence but screw that.


They could be handing those out for free and they'd still rot on the shelves. People just don't know what to do with them and I am not certain Amazon does.

Charge for them? That's metaverse-quality thinking.

They were pretty much giving away Google Home Minis in Canada and I still decided against it. See them at thrift stores all the time.


I no longer use mine for much but to control one light with an illogical switch placement. I could pretty much replace her with The Clapper™️