Generative AI’s environmental costs are soaring — and mostly secret

flango@lemmy.eco.br to News@lemmy.world – 241 points –
nature.com

one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
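Taking the excerpt's "33,000 homes" figure at face value, a rough conversion into a continuous power draw looks like the sketch below. The per-home consumption figure is an assumption for illustration (roughly a typical US household), not a number from the article:

```python
# Rough conversion of the "33,000 homes" figure into a power draw.
# KWH_PER_HOME_PER_YEAR is an assumed average US household figure,
# not taken from the article.
HOMES = 33_000
KWH_PER_HOME_PER_YEAR = 10_500  # assumption: typical US household
HOURS_PER_YEAR = 8_760

# kWh/year -> average kW -> MW
avg_draw_mw = HOMES * KWH_PER_HOME_PER_YEAR / HOURS_PER_YEAR / 1000
print(f"Implied continuous draw: ~{avg_draw_mw:.0f} MW")
```

Under those assumptions the figure works out to roughly 40 MW of continuous draw, i.e. on the scale of a small power plant.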



Tbf, talking about the environmental costs of generative AI is just framing.
The issue is the environmental cost of electricity, no matter what it is used for.
If we want this to be factored into consumption decisions, it needs to be part of the electricity price. And of course all other power sources, like combustion engines, need to price in their external costs too.
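The "price in external costs" idea above amounts to a per-kWh adder. A minimal sketch, where the tariff, grid carbon intensity, and carbon price are all assumed figures for illustration:

```python
# Sketch of folding an external (carbon) cost into the electricity
# price. All three figures below are assumptions, not real tariffs.
BASE_PRICE = 0.15            # assumed retail rate, $/kWh
GRID_CO2_KG_PER_KWH = 0.4    # assumed grid carbon intensity
CARBON_PRICE_PER_KG = 0.05   # assumed external cost, $/kg CO2

def priced_kwh(base: float, intensity: float, carbon_price: float) -> float:
    """Electricity price with the external carbon cost folded in."""
    return base + intensity * carbon_price

print(f"${priced_kwh(BASE_PRICE, GRID_CO2_KG_PER_KWH, CARBON_PRICE_PER_KG):.2f}/kWh")
```

The point of the scheme is that any use of electricity, AI or otherwise, then pays for its externalities automatically.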

It should be considered. An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI: the AI server has higher power requirements (the margin may not be as wide as in my first comparison, but there is one). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.
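The scaling point above can be sketched with rough numbers. The per-server draws are assumptions (a modest web box versus a multi-GPU node), chosen only to show how the gap widens when multiplied across a fleet:

```python
# Rough sketch of fleet-level power draw. Per-server wattages are
# assumptions for illustration, not measurements.
ECOMMERCE_SERVER_W = 500   # assumed average draw, web/e-commerce box
AI_SERVER_W = 5_000        # assumed draw, multi-GPU inference node
NUM_SERVERS = 300

def fleet_kw(per_server_w: float, count: int) -> float:
    """Total continuous draw in kW for a fleet of identical servers."""
    return per_server_w * count / 1000

print(f"e-commerce fleet: {fleet_kw(ECOMMERCE_SERVER_W, NUM_SERVERS):.0f} kW")
print(f"AI fleet:         {fleet_kw(AI_SERVER_W, NUM_SERVERS):.0f} kW")
```

A per-server difference that looks modest in isolation becomes a megawatt-scale gap once hundreds of units are involved.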

an ultra-light laptop, or a raspberry pi consume WAY less power than a full on gaming rig, and the same can be said between a data server that is used for e-commerce and a server running AI

And if external costs are priced into the cost of electricity then that will be reflected in the cost of operating these devices.
Also, there are far more conventional data servers than servers running AI, which increases their total effect.