etrotta

@etrotta@beehaw.org
0 Post – 5 Comments
Joined 1 year ago

To be fair, I wouldn't include "loading the whole model into VRAM" as part of the cost, given they can just keep it there between requests, and the parameter count might be down to hundreds of billions or tens of billions instead of trillions... but even after all the improvements, it should still be orders of magnitude more expensive than a normal search, which just makes their decision even crazier.

The vast majority of consumer devices, both mobile and laptops/desktops, are not yet powerful enough to run local AI with a good user experience, and even if they were, a lot of users would still prefer having it run in the cloud rather than draining their phone battery.

It is pretty much unplayable without the bank storage upgrades though. Given the amount of time they can save you, you might as well say they are power.

It still has more upsides than your average X-as-a-service. Beyond what the OP already said, it lets you jump right into a game without having to wait for it to download, which is pretty big when games regularly take >100GB of space, and even more so if you like to switch between different games often.

If you try a game and decide you don't like it, you may well end up using less bandwidth than if you had downloaded it, not to mention zero waiting time (unless there's a queue, but usually there isn't, assuming you have an actual paid subscription and it isn't an absurd demand peak).

Saying that Stable Diffusion was trained by "individuals" is a bit of a stretch. It cost over half a million dollars' worth of compute to train, and Stability AI is still a company at the end of the day. If that still counts as trained by individuals, then so do Midjourney and DALL-E.