
AI's energy footprint

201 points | 1 day | technologyreview.com
BewareTheYiga | 24 hours ago
panstromek1 hour ago

There's a certain irony here in the fact that this page is maxing out my CPU on idle, doing some unclear work in javascript, while I'm just reading text.

SoftTalker14 hours ago

If you are old enough you remember posting to Usenet and the warning that would accompany each new submission:

This program posts news to thousands of machines throughout the entire civilized world. Your message will cost the net hundreds if not thousands of dollars to send everywhere. Please be sure you know what you are doing. Are you absolutely sure that you want to do this? [ny]

Maybe we need something similar in LLM clients. It could be phrased in terms of how many pounds of atmospheric carbon the request will produce.
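A back-of-the-envelope sketch of what such a warning could look like; the per-query energy and grid-intensity figures are made-up placeholders, not measurements for any real model or grid:

```python
# Hypothetical per-request carbon notice, in the spirit of the old
# Usenet warning. Both figures below are assumptions for illustration.
WH_PER_QUERY = 3.0          # assumed energy per LLM request, watt-hours
GRID_G_CO2_PER_KWH = 480.0  # assumed grid carbon intensity, g CO2/kWh
G_PER_POUND = 453.592

def carbon_warning(n_queries: int) -> str:
    grams = n_queries * (WH_PER_QUERY / 1000.0) * GRID_G_CO2_PER_KWH
    pounds = grams / G_PER_POUND
    return (f"These {n_queries} request(s) will emit roughly "
            f"{pounds:.4f} lb of CO2. Are you sure? [ny]")

print(carbon_warning(100))
```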

smusamashah1 hour ago

Your quote is actually telling the opposite of your suggestion.

So they used to send this message, but at some point it stopped, I assume. Either costs dropped a lot or the benefits came to outweigh them. The same can happen here.

yongjik14 hours ago

A lot of us live in a country where "rolling coal" is a thing. I fear your prompt may have the opposite of the intended effect.

UnreachableCode4 hours ago

You say that as if rolling coal people are capable of using or understanding LLMs

lvturner4 hours ago

While I might question their sanity and/or ethics, it's generally not a good idea to underestimate a fool.

bravetraveler4 hours ago

You say that as if LLMs aren't another dumbing-down interface.

01HNNWZ0MV43FF13 hours ago

Taxing anything that can pollute (methane, gasoline, diesel) would let The Hand sort it out

freeone300012 hours ago

Carbon taxes are incredibly unpopular, because it makes easy and convenient things expensive.

blkhawk13 hours ago

I find that message very curious, because the message itself clearly does not cost much; the machines it is sent on do. So the more messages that are sent, the less each message will cost.

anon70003 hours ago

Well, Usenet is 45 years old, and the internet was not nearly as cheap and ubiquitous then

wmf14 hours ago

Only if that same warning is attached to literally everything else you do. It's unfair to single out AI for using energy.

SoftTalker14 hours ago

But "asking ChatGPT" is something people do so casually without there being any apparent clues of the costs associated with that. I guess that is true of pretty much everything online though.

Even driving your car around, you are at least somewhat aware of the gas you are burning.

wmf13 hours ago

"Civilization advances by extending the number of important operations which we can perform without thinking of them." — Alfred North Whitehead

Emissions should be fixed on the production side (decarbonization) not on the demand side (guilt/austerity).

sanktanglia5 hours ago

People spend hours a day mindlessly scrolling social media apps (streaming video calls to boot) that also use up water and energy per hour, but are totally disconnected from it.

appreciatorBus14 hours ago

I would bet most ppl drive around with very little awareness of how much it’s costing, either in money or environmental impact. Many people I’ve met seem to measure efficiency by how much it costs to fill up the tank.

asdff14 hours ago

That could be solved by charging more for the service. That is the only reason you are aware of the gas burning, after all: you aren't conducting your own air-quality test, you are noticing that you're filling up twice a week at $50 a tank.

phillipcarter13 hours ago

They're aware of the price they pay for the gas, not the emissions. I would wager that the mass ignorance of the impact of fossil fuels (and rubber on roads) that the broader population has is a significant reason why we're in this climate mess today.

keybored13 hours ago

You would wager. Based on what?

There’s been decades of lies about climate change. And once the truth got out, society was already massively dependent on fossil fuels. For cars specifically, it was deliberate policy to make e.g. the US car-dependent. And once the truth got undeniable, the cope was switched to people’s "carbon footprint" (a term pushed by British Petroleum). In fact there are rumors that the cope echoes to this day.

Zoom out enough and it becomes obviously unproductive to make “mass ignorance” the locus of attention.

paulcole7 hours ago

Obviously people have zero awareness of or interest in their true impact on the environment. This extends to every facet of life and is not at all limited to AI use.

Do you really think the average person could get within 2 orders of magnitude when estimating their carbon footprint for a year?

reaperducer12 hours ago

> It's unfair to single out AI for using energy.

Why? AI isn't a human being. We have no obligation to be "fair" to it.

thot_experiment12 hours ago

Yeah we do, it's basic epistemic hygiene. If you don't freak out about running your shower or microwave for a couple seconds or driving a few hundred feet you shouldn't be concerned about prompting an AI.

reaperducer11 hours ago

Apologizing for AI boiling the oceans sounds like a lot of whataboutism.

I can picture an Elizabeth Holmesian cartoon clutching her diamond necklace.

"Oh, won't somebody think of the tech billionaires?!"

> If you don't freak out about running your shower or microwave for a couple seconds or driving a few hundred feet

The basic premise of the modern tech industry is scale. It's not one person running a microwave for a couple of seconds, it's a few billion people running a microwave for the equivalent of decades.

GuinansEyebrows13 hours ago

fairness to one polluter over another isn't the real issue - look at prop 65 in california; or if you're not used to this in CA, think of any time you've been on-call. alert fatigue is real and diminishes the urgency of the underlying message.

keybored13 hours ago

You don’t need it for any pragmatic benefit because it won’t work. It doesn’t work for eating meat. It won’t work for AI.

The only purpose is to scapegoat the possible environmental or economic fallout. Might as well put it on individuals. Like what’s always done.

I’ve already seen it on the national broadcast, where some supposed experts were wagging their fingers about using AI for "fun", i.e. making silly images.

Meanwhile we’re gonna put AI to good use in arms races: more spam (automated applications, ads, ads, ads, abuse of services) and anti-spam. There’s gonna be so much economic activity. Disruptive.

kmeisthax13 hours ago

Individual LLM requests are vanishingly small in terms of environmental impact; inference providers use a lot of batching to do lots of work at once. Furthermore, LLMs and diffusion models are not the only ML workload. While generative AI tickles investors, most of the ML actually being deployed is more mundane things, like recommendation systems, classifiers, and the like; much of which is used for adtech purposes adversarial to that of users. If LLMs and diffusers were the only thing companies used ML for, but efficiency gains from new hardware remained constant, we'd still be at the 2017 baseline for environmental impact of data centers.

Likewise, I doubt that USENET warning was ever true beyond the first few years of the network's lifetime. Certainly if everything was connected via dial-up, a single message could incur hundreds of dollars of cost when you added up the few seconds of line time it took to send it across the whole world. But that's accounting for a lot of Ma Bell markup. Most connections between sites and ISPs on USENET were done through private lines that ran at far faster speeds than what you could shove down copper phone wiring back then.

cj11 hours ago

If what you're saying is true, why are we hearing about AI companies wanting to build nuclear power plants to power new data centers they think they need to build?

Are you saying all of that new capacity is needed to power non-LLM stuff like classifiers, adtech, etc? That seems unlikely.

Had you said that inference costs are tiny compared to the upfront cost of training the base model, I might have believed it. But even that isn't accurate -- there's a big upfront energy cost to train a model, but once it becomes popular like GPT-4, the inference energy cost over time is dramatically higher than the upfront training cost.

You mentioned batch computing as well, but how does that fit into the picture? I don't see how batching would reduce energy use. Does "doing lots of work at once" somehow reduce the total work / total energy expended?

lkbm11 hours ago

> If what you're saying is true, why are we hearing about AI companies wanting to build nuclear power plants to power new data centers they think they need to build?

Well, partly because they (all but X, IIRC) have commitments to shift to carbon-neutral energy.

But also, from the article:

> ChatGPT is now estimated to be the fifth-most visited website in the world

That's ChatGPT today. They're looking ahead to 100x-ing (or 1,000,000x-ing) the usage as AI replaces more and more existing work.

I can run Llama 3 on my laptop, and we can measure the energy usage of my laptop--it maxes out at around 0.1 toasters. o3 is presumably a bit more energy intensive, but the reason it's using a lot of power is the >100MM daily users, not that a single user uses a lot of energy for a simple chat.
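Rough arithmetic for that, taking a toaster as ~1000 W and assuming ~30 seconds to generate a reply (both round numbers, not measurements):

```python
# Back-of-envelope energy for one local chat reply.
TOASTER_W = 1000.0                # assumed toaster power draw
laptop_w = 0.1 * TOASTER_W        # "0.1 toasters" from the comment
seconds_per_reply = 30.0          # assumed time to generate one reply

wh_per_reply = laptop_w * seconds_per_reply / 3600.0
print(f"~{wh_per_reply:.2f} Wh per reply")
```

Under a watt-hour per reply; the big totals come from the user count, not the single chat.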

protocolture9 hours ago

>If what you're saying is true, why are we hearing about AI companies wanting to build nuclear power plants to power new data centers they think they need to build?

Something to temper this: lots of these AI datacenter projects are being cancelled or put on hiatus because the demand isn't there.

But if someone wants to build a nuke reactor to power their datacenter, awesome. No downsides? We are concerned about energy consumption only because of its impact on the earth in terms of carbon footprint. If it's nuclear, the problem has already been solved.

FreezerburnV9 hours ago

Because training costs are sky-high, and handling an individual request still uses a decent amount of energy, even if it isn't as horrifying as training. Plus the number of requests, and the amount of content in them, is going up with things like vibe coding.

If you want to know more about energy consumption, see this 2 part series that goes into tons of nitty-gritty details: https://blog.giovanh.com/blog/2024/08/18/is-ai-eating-all-th...

adrianN1 day ago

The important part remains internalizing emission costs into the price of electricity. Fussing over individual users seems like a distraction to me. Rapid decarbonization of electricity is necessary regardless of who uses it. Demand will soar anyway as we electrify transportation, heating, and industry.

seb120424 hours ago

I agree, but reducing consumption and increasing efficiency are still very important aspects of the energy transition. What is not consumed does not need to be generated.

nostrademons14 hours ago

If you internalize emissions costs into the price of electricity, reduced consumption will happen naturally. Precisely nobody likes higher energy bills, so there's a natural incentive to reduce consumption as long as you're paying for it.

asdff12 hours ago

I wonder how much households can really save here. Most "luxury" items using electricity don't really use much, e.g. a modern laptop or smartphone. The stuff that does use a lot of electricity is things like your AC unit, your electric heater, and your electric stove. There seems to be little wiggle room there; people might just end up saddled with higher bills, especially if slightly more efficient home appliances are out of reach (or not purchased by the renter at all).

And for people whose budgets would be hit hard for lack of income, there are usually subsidies to help pay for their energy usage, which might further stymie market forces from changing behavior. Most high-energy-use consumers seem to have high enough incomes that they won't be much affected by increased power costs, just as we already see them unaffected by water restrictions and higher fees for high water usage.

Maybe that says the fees aren't yet high enough for high income people to change behavior, but I'm willing to bet they never truly will be due to the influence this subset of the population holds over politics.

bee_rider6 hours ago

Carbon taxes could be phased in over time, to give people a chance to make that decision over the course of natural appliance-replacement cycles.

Even if rich people don’t consume much more energy than poor people (I have no idea, just engaging with your idea as stated), they must be buying something with their money… carbon taxes should raise the price of goods with lots of embodied carbon.

If they aren’t consuming much energy and they aren’t buying stuff with much embodied carbon… I dunno, I guess that’s the goal, right?

wmf7 hours ago

It's not about households anyway, it's about transportation and industrial usage. Larger companies have enough scale that they can afford to invest in efficiency.

_aavaa_10 hours ago

Some of these would benefit from changes (e.g. electric heating -> heat pump). Others would be better off with other changes. E.g. too much cooling? Consider better awnings, stronger blinds, or even IR rejecting films.

As for the stove, how much it uses is directly related to the kind of cooking you do, and for how long.

designerarvid6 hours ago

It would incentivise energy production that avoids those costs.

delusional13 hours ago

"internalizing emissions" is the kind of thing that's really easy to say, even conceptualize, but really difficult to implement.

You could do it better than we are doing now, but you'll always have people saying: "that's unfair, why are you picking on me"

0_____021 hours ago

Sometimes I see chatter about using solar or nuclear or whatever power for data centers, thereby making them "clean," and it's frustrating that there isn't always the acknowledgement that the clean energy could displace other dirty generation.

Even with things like orphaned natural gas that gets flared otherwise - rescuing the energy is great but we could use it for many things, not just LLMs or bitcoin mining!

AnthonyMouse9 hours ago

> the clean energy could displace other dirty generation.

If you would have built 10GW of solar or nuclear to replace other generation and instead the data center operators provide funding to build 20GW so that 10GW can go to data centers, the alternative wasn't replacing any of the other dirty generation. And the economies of scale may give the non-carbon alternatives a better cost advantage so you can build even more.

oezi6 hours ago

In some markets this might be right but in others it isn't. For instance, if you have CO2 certificates associated with a product then not buying it won't change emissions. It will make the price of certificates cheaper for everyone else and lead to other consumption elsewhere.

designerarvid6 hours ago

Pricing is a good way to steer in this direction. We would have more salad waste if salad was free to bring home from the supermarket I’m sure.

adrianN23 hours ago

Yeah, but pricing signals are a good way of reaching those goals.

Scarblac23 hours ago

Businesses will only start doing that in significant amounts when carbon emissions are priced according to their environmental impact.

Lerc23 hours ago

I don't think it's a given that reducing energy consumption is a required part of the transition.

Increasing demand can lead to stimulus of green energy production.

Uehreka14 hours ago

There’s no rule that increased demand will necessarily stimulate green energy production, only that it will stimulate energy production. And getting people to care about climate gets tougher, not easier, when energy demand goes up.

hermitShell9 hours ago

To do that we would need society to agree about what the emission cost is.

Making electricity so abundant and efficient is probably more solvable. You can’t solve stu… society

designerarvid6 hours ago

Agreed, that has always been the key question for sustainability. Price is a fantastic mechanism, but negative externalities remain.

somewhereoutth12 hours ago

Indeed. However the problem with LLMs is that vast amounts of VC money are being thrown at them, in the [misplaced] hope of great returns. This results in a resource mis-allocation of biblical proportions, of which unnecessary carbon emissions are a part.

neves8 hours ago

Best article I've ever read about the energy needs of AI.

Impressive how Big Tech refuses to share data with society for collective decisions.

I'd also recommend the Data Vampires podcast series:

https://techwontsave.us/episode/241_data_vampires_going_hype...

https://techwontsave.us/episode/243_data_vampires_opposing_d...

https://techwontsave.us/episode/245_data_vampires_sacrificin...

https://techwontsave.us/episode/247_data_vampires_fighting_f...

dr_dshiv2 hours ago

Solving climate change will take a lot of energy.

I found this article to be a little too one-sided. For instance, it didn’t talk about the roughly 10x reductions in power achieved this past year — essentially how GPT-4-class models can now run on a laptop.

Viz, via sama “The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger.” https://blog.samaltman.com/three-observations

Juliate2 hours ago

It's still _additional_ power usage that

1/ did not exist before

2/ does not replace/reduce previous/other power (some, very much more critical and essential) usages.

3/ a LOT of tasks are still way more energy/time-efficiently done with regular existing methods (dedicated software, or even by hand), but are still asked/improperly routed to AI chatbots that ... statistically guess the answer.

thedevilslawyer3 hours ago

What's the energy imprint of the human doing the same work that AI is now going to do? If AI's imprint is less, what is the right thing for us to do?

ahtihn2 hours ago

Kill the human?

If you don't want to go there, it doesn't really matter how much energy the human uses because the human will just use the same energy to do something else.

zavec2 hours ago

Not necessarily. I think the point of comparison here is how much energy does AI use to e.g. generate a video, compared to the energy used not by the human themselves, but by running XYZ software on a computer with a beefy graphics card for however many hours it'd take a human to do the same work.

simonw14 hours ago

I found this bit interesting:

> In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023.

As we all know, the generative AI boom only really kicked into high gear in November 2022 with ChatGPT. That's five years of "AI" growth between 2017 and 2022 which presumably was mostly not generative AI.

mattnewton5 hours ago

2017 is the year after AlphaGo beat Lee Sedol, and it's when the "Attention Is All You Need" paper was published. The writing was on the wall. OpenAI just found product-market fit in November 2022, but the industry wasn't wandering aimlessly until then.

SoftTalker11 hours ago

People started using GPUs for ML at around that time.

NitpickLawyer4 hours ago

Meta was in the process of AI-ing everything on their stacks, with search, similarity, graphs, recommendation, etc. everywhere on their properties. Incidentally that's why they already had tens/hundreds of thousands of GPUs when the LLM craze hit, and why they were in a good place to work on llama and other stuff.

liendolucas1 hour ago

> ...and many buildings use millions of gallons of water (often fresh, potable water) per day in their cooling operations.

This is outrageous. People still struggle to access fresh water (and power), but hey, "sustainability is all to our company" is always promoted as if something nice is being done on the behemoths' side. BS. What a waste of resources.

I truly condemn all this. To this day I do still refuse to use any of this technology and hope that all this ends in the near future. It's madness. I see this as nothing more than next-gen restrictive lousy search engines, and as many have pointed out ads are going to roll soon. The more people adopt it the worse will be for everyone.

I always emphasize this: 10-15 years ago I could find everything through simple web searches. Everything. In many cases even landing on niche and unexpected but useful and interesting websites. Today that is a difficult/impossible task.

Perhaps there is still room for a well-done traditional search engine (I haven't tried Kagi, but people in general say nice things about it) to surface and take the lead, but I doubt it; when hype arrives, especially in the tech industry, people follow blindly. There are still flourishing "ai" startups, and overnight everyone has become a voice or expert on the subject. Again: BS.

Traditional web search engines were absolutely fine, and I remember being quite impressed with their results. What the heck has happened?

mentalgear1 day ago

When companies make ESG claims, sensible measurement and open traceability should always be the first proof they must provide. Without these, and validation from a credible independent entity such as a non-profit or government agency, all ESG claims from companies are merely PR puff pieces to keep the public at bay (especially in "AI").

stevage21 hours ago

esg?

JohnFen21 hours ago

Environmental/Social/Governance. From Wikipedia:

Environmental, social, and governance (ESG) is shorthand for an investing principle that prioritizes environmental issues, social issues, and corporate governance.

kkarakk14 hours ago

when has ESG not been FUD or a way to bypass sanctions from poorly thought out climate change targets?

xeox5382 hours ago

I believe we're currently seeing AI in the "mainframe" era, much like the early days of computing, where a single machine occupied an entire room and consumed massive amounts of power, yet offered less compute than what now fits in a smartphone.

I expect rapid progress in both model efficiency and hardware specialization. Local inference on edge devices, using chips designed specifically for AI workloads, will drastically reduce energy consumption for the majority of tasks. This shift will free up large-scale compute resources to focus on truly complex scientific problems, which seems like a worthwhile goal to me.

panstromek57 minutes ago

It sure seems like that to me. I was pretty impressed by how easily I could run a small Gemma on a 7-year-old laptop and get a decent chat experience.

I can imagine that doing some clever offloading to normal programs, and using the LLM as a sort of "fuzzy glue" for the rest, could improve the efficiency of many common tasks.

teekert14 hours ago

I wonder what the carbon footprint of all those ads is.

amelius14 hours ago

Not just the ads, but also the overconsumption which they cause.

__MatrixMan__10 hours ago

Not just overconsumption, but also waste due to supply chain fragility. If you can induce demand anywhere then supply has to do crazy things to keep up.

Ericson231414 hours ago

We really need to improve the power grid. I don't think about "A. I." very much, but I am glad that something is making us upgrade the grid.

lucb1e12 hours ago

> I am glad that something is making us upgrade the grid

A few big consumers in centralized locations isn't changing the grid as much as the energy transition from fuels to electricity is

Ericson231411 hours ago

The transition to electric vehicles in the US has been disappointing, to say the least.

lucb1e11 hours ago

I thought everyone over there has an air conditioning system on electricity?

Ericson23148 hours ago

Yes, but we already do. We need new electricity demand.

floxy12 hours ago

40% of electricity consumption in Virginia will be data centers in 2030?

Table A1 , PDF page 29:

https://www.epri.com/research/products/000000003002028905

Henchman2113 hours ago

I work in DCO, that's Data Center Operations if you're not aware. I've tried explaining the amount of power used to my elderly mom; it isn't easy! But here's my best take:

The racks I am personally responsible for consume 17.2kW. That’s consistent across the year; sure things dip a bit when applications are shut down, but in general 17.2kW is the number. Presuming a national average of 1.2kW per home, each rack of equipment I oversee could potentially power 14 houses. I am responsible for hundreds of these racks, while my larger organization has many thousands of these racks in many locations worldwide.
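The same comparison as arithmetic, using the figures above (the 500-rack count below is just a stand-in for "hundreds of racks"):

```python
# Rack-to-households comparison. The 1.2 kW household figure is the
# comment's presumed national average, not a measured value.
rack_kw = 17.2
home_kw = 1.2
homes_per_rack = rack_kw / home_kw
print(f"one rack ~= {homes_per_rack:.0f} homes")

racks = 500   # hypothetical count standing in for "hundreds of racks"
print(f"{racks} racks ~= {racks * homes_per_rack:.0f} homes")
```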

I’ve found no other way to let the scale of this sink in. When put this way she is very clear: the price isn’t worth it to humanity. Being able to get, say, Door Dash, is pretty neat! But not at the cost of all our hoarded treasure and certainly not at the cost of the environment on the only planet we have access to.

The work done by AI will only ever benefit the people at the top. Because to be frank: they won’t share. Because the very wealthy have hoarding disorder.

jahewson6 hours ago

But it can’t really power “14 houses” because people in those houses are consuming external services such as those provided by your racks.

Unless your racks can only serve 14 customers.

jeffbee12 hours ago

It seems like you are having an emotional response to not understanding the general energy picture. For example, an A320 aloft uses the same energy as two thousand of your hypothetical racks (2.5 tons of kerosene per hour).

Each!

We are in no meaningful sense torching the biosphere to get AI.
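For what it's worth, the comparison roughly checks out against the 17.2 kW rack figure upthread, assuming ~43 MJ/kg for jet fuel:

```python
# Sanity check on the A320 figure: 2.5 t of kerosene per hour
# converted to continuous electrical-equivalent power.
kerosene_kg_per_h = 2500.0        # 2.5 tons per hour, from the comment
mj_per_kg = 43.0                  # approximate specific energy of Jet A
aircraft_kw = kerosene_kg_per_h * mj_per_kg * 1000.0 / 3600.0  # MJ/h -> kW
rack_kw = 17.2                    # rack figure from the parent comment
print(f"~{aircraft_kw:.0f} kW, i.e. ~{aircraft_kw / rack_kw:.0f} racks")
```

That comes to roughly 30 MW, on the order of the claimed two thousand racks.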

globnomulous6 hours ago

> It seems like you are having an emotional response to not understanding the general energy picture.

This is condescending and rude. It also strikes me as obviously wrong. The 'response' isn't emotional in the slightest; it just explains the emotional and cognitive experience of acquiring understanding. It's a reasonable, well-reasoned explanation of the difficulty of intuitively grasping how much energy these data centers, and thus the companies that use them, consume -- and then of the shock many experience when it dawns on them.

> For example, an A320 aloft uses the same energy as two thousand of your hypothetical racks (2.5 tons of kerosene per hour).

> Each!

> We are in no meaningful sense torching the biosphere to get AI.

What exactly is the reasoning here? That airplanes use a lot of energy, just like data centers or compared to data centers, and therefore that AI isn't ecologically damaging -- or isn't "meaningfully" damaging, whatever that means? That's not just wrong. It's a nonsequitur.

There's a simpler, easier, better way to wrap our heads around the data, one that doesn't require false dichotomies (or something like that): humans are torching the biosphere both to get AI and to travel.

myaccountonhn1 hour ago

Well said. Thank you.

fallingknife5 hours ago

But if you substitute the pre-doordash system of calling up a restaurant and ordering delivery or take out, the energy savings aren't even 1%. One gallon of gas contains 34 kWh of energy, so if one delivery takes 0.5 gallons of gas, it uses enough energy to power one of your racks for an hour. How many doordash orders can be processed by one of your racks in an hour? It's got to be in the millions.
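The arithmetic, spelled out (the half-gallon per delivery is the comment's assumption, not a measured figure):

```python
# Delivery-versus-rack comparison using the figures in the comment.
kwh_per_gallon = 34.0        # chemical energy in a gallon of gasoline
gallons_per_delivery = 0.5   # assumed round trip
delivery_kwh = kwh_per_gallon * gallons_per_delivery
rack_kw = 17.2               # rack draw from the grandparent comment
print(f"one delivery ~= {delivery_kwh:.0f} kWh "
      f"~= {delivery_kwh / rack_kw:.2f} rack-hours")
```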

teleforce9 hours ago

> unprecedented and comprehensive look at how much energy the AI industry uses

Not sure about comprehensive claim here if end-to-end query chains were not considered.

For example the mobile wireless node (that're being used by the majority of the users) contribution to the energy consumption are totally ignored. The wireless power amplifier or PA for both sides of users and base-stations are notorious for their inefficiency being only less than than 50% in practice although in theory can be around 80%. Almost all of the current AI applications are cloud based not local-first thus the end users energy consumption and contribution are necessary.

Proofread05921 day ago

> When you ask an AI model to write you a joke or generate a video of a puppy, that query comes with a small but measurable energy toll and an associated amount of emissions spewed into the atmosphere. Given that each individual request often uses less energy than running a kitchen appliance for a few moments, it may seem insignificant.

> But as more of us turn to AI tools, these impacts start to add up. And increasingly, you don’t need to go looking to use AI: It’s being integrated into every corner of our digital lives.

Forward looking, I imagine this will be the biggest factor in increasing energy demands for AI: companies shoving it into products that nobody wants or needs.

xarope6 hours ago

I have zero use for the AI summary that google chooses to prepend to my search summary. I decided to fact check a few, and they were totally wrong in subtle ways, either changing a word or phrase and thus inverting the meaning, or else completely summarizing from a site which had no relevance to my search.

So yes, I'd like to disable this completely. Even if it's just a single birthday candle worth of energy usage.

Uehreka14 hours ago

In the short term perhaps, but even without carbon pricing the raw electricity prices will probably tamp down the enthusiasm. At some point it’ll become cool for activist investors to demand to see ROI for AI features on earnings calls, and then the fat will get trimmed just like any other trend that goes too far.

I think the bigger underrated concern is if LLMs fall into an unfortunate bucket where they are in fact generally useful, but not in ways that help us decarbonize our energy supply (or that do, but not enough to offset their own energy usage).

Nition6 hours ago

> increasing energy demands for AI: companies shoving it into products that nobody wants or needs

I think of this a little every time Google gives me another result with the AI summary and no option for anyone to turn it off. Apparently worldwide there are 8+ billion searches every day.

keybored13 hours ago

Try to buy something that isn’t wrapped in three layers of plastic. Or that isn’t made of plastic itself. Then go to the checkout and see their “PSA” about how asking for a plastic bag to carry your plastic merchandise kills the planet.

I’m sorry. I’m being blocked by some mysterious force from understanding what “actual human” means. And I don’t know how to get you in contact with your car manufacturer. Would you like me to repeat my 20 step suggestion on how to troubleshoot “why does my shitty car put the A/C on freezer mode whenever “Indian Summer” tops the charts in Bulgaria”, but with more festive emojis?

johnny_________7 hours ago

I ponder this a lot, but the interface of MIT Technology Review is unbearably overdesigned: it's got that annoying narrow smartphone format where you can't zoom out, and then all these fancy graphics. Can't we have crisp, easy-to-read HTML? The format annoyed me so much I didn't read the article, because this kind of design makes me doubt the source. Alas.

myaccountonhn2 hours ago

The article works fine with the browser's reader-mode for me.

mg1 day ago

The brain uses 20% of the human body's energy.

I wouldn't be surprised if mankind will evolve similar to an organism and use 20% of all energy it produces on AI. Which is about 10x of what we use for software at the moment.

But then more AI also means more physical activity. When robots drive cars, we will have more cars driving around. When robots build houses, we will have more houses being built, etc. So energy usage will probably go up exponentially.

At the moment, the sun sends more energy to earth in an hour than humans use in a year. So the sun alone will be able to power this for the foreseeable future.
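
A rough sanity check of that claim, using rounded public figures (treated here as assumptions, not sourced from the article): solar power reaching Earth is on the order of 173,000 TW, and world primary energy consumption is on the order of 600 EJ per year.

```python
# Back-of-envelope check: one hour of sunlight vs. one year of human energy use.
# Both input figures are rough assumptions, not from the article.
solar_power_tw = 173_000                          # solar power intercepted by Earth, in terawatts
one_hour_joules = solar_power_tw * 1e12 * 3600    # TW -> W, then W * seconds = joules
annual_human_use_joules = 600e18                  # ~600 EJ/year of primary energy

ratio = one_hour_joules / annual_human_use_joules
print(round(ratio, 2))  # ≈ 1.04, i.e. one hour of sunlight ≈ one year of use
```

So under these assumptions the claim holds at the order-of-magnitude level.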

Scarblac23 hours ago

But the article says that energy use by AI is 48% more carbon intensive than the US average. So talk of solar power is a red herring -- that's not what it is running on now.

mg23 hours ago

I am thinking about the future here.

I don't think there will be much carbon-intensive energy generation a few decades from now. It does not make sense economically.

Scarblac23 hours ago

You said "for the foreseeable future", which I interpret as being about now.

Anyway I hope you're right, but so far global CO2 output is still growing. All the other energy has only come on top of carbon intensive energy, it hasn't replaced any of it. Every time we build more, we find new ways of spending that much energy and more.

mg21 hours ago

Seeing 20 years into the future is quite possible in some aspects.

I remember how my friends and I discovered email in 1999 and were like "Yay, in the future we'll all do this instead of sending letters!". And it took about 20 years until letters were largely replaced by email and the web. And when the first videos appeared on the web, it was quite clear to us that they would replace DVDs.

Similar with the advent of self driving cars and solar energy I think.

newtonsmethod10 hours ago

The energy used by AI probably is just as carbon intensive, if not more so, but the article never says that. It talks about the energy use of data centers in general.

> The carbon intensity of electricity used by data centers was 48% higher than the US average.

AnthonyMouse8 hours ago

In case anyone is wondering why that is, it's because they put data centers in the places with the cheapest electricity. Which, in the US, is in places like Virginia and Ohio, where they burn fossil fuels.

If the people always talking about how cheap solar is want to fix this, find a way to make that cheapness actually make it into the customer's electric bill.

HelloUsername24 hours ago

> energy usage will probably go up exponentially

kind of sounds like Jevons paradox? https://wiki.froth.zone/wiki/Jevons_paradox

asdff14 hours ago

This assumes no technological adaptations toward efficiency. Consider walking a mile: the energy expenditure isn't insignificant. Now imagine you have a bicycle. Some cyclists will train and do century rides, a distance that was never possible by walking in a day. But those are few; most riders will not push their capability to that extent, yet they still take advantage of the efficiency of the bike.

vmg1224 hours ago

> When robots drive cars, we will have more cars driving around

This doesn't seem true. In SF, Waymo does more rides with 300 cars than Lyft does with 45k drivers. If self-driving cars interleave different tasks based on their routes, I imagine they would be much more efficient per mile.

0_____021 hours ago

Is it really only 300 cars? They feel like they're everywhere!

simonw13 hours ago

I thought it was more than that too, but according to https://www.reuters.com/business/autos-transportation/alphab...

> With more than 700 vehicles in its fleet - 300 of which operate in San Francisco - Waymo is the only U.S. firm that runs uncrewed robotaxis that collect fares.

Those numbers are from April 2025.

floxy12 hours ago

>We’ve also incrementally grown our commercial fleet as we’ve welcomed more riders, with over 1,500 vehicles across San Francisco, Los Angeles, Phoenix, and Austin.

https://waymo.com/blog/2025/05/scaling-our-fleet-through-us-...

floxy12 hours ago

>This doesn't seem true.

Seems like we are way too early in the adoption curve to tell. Currently the average number of passengers per trip is >1.0 across the whole fleet. Some day, I'd expect that to dip below 1.0, as people send an empty car to pick up the dog from the vet, or circle the block to avoid having to pay for parking, etc.

thelastgallon11 hours ago

Thank you for this data point. It massively lowers the embodied carbon footprint (carbon from manufacturing, supply chain, transportation, etc.). Operational carbon is a solved problem; it is easy to measure and can be supplied from renewable sources.

asdff14 hours ago

If Waymo is doing more rides with 300 cars than Lyft does with 45k drivers, we can assume that Waymo cars are on the road serving customers at least 150x as long as a Lyft driver. So yes, it could really mean more cars are "around" even if the fleet is much smaller.
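
The arithmetic behind that 150x figure, with the thread's numbers taken as assumptions (neither fleet count is verified data):

```python
# If 300 Waymo cars match the ride volume of 45,000 Lyft drivers,
# each car must serve riders ~150x as much as an average driver.
# Both figures come from the comments above, not from official sources.
lyft_drivers = 45_000
waymo_cars = 300

utilization_ratio = lyft_drivers / waymo_cars
print(utilization_ratio)  # 150.0
```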

mg23 hours ago

Existing rides will be done more efficiently, but since rides are so much cheaper without a driver, many more rides will be done.

A car driving from A to B will cost less than 50% of the current price, which will unlock a huge number of new rides.

fallingknife5 hours ago

That would mean that 5.5% of the SF population are Lyft drivers
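
That percentage roughly checks out, assuming an SF population of about 810,000 (an assumption, not a figure from the thread):

```python
# Sanity check: what share of SF's population would 45k Lyft drivers be?
lyft_drivers = 45_000
sf_population = 810_000  # rough assumption for San Francisco

share_pct = lyft_drivers / sf_population * 100
print(round(share_pct, 1))  # ≈ 5.6
```

Which supports the skepticism: the 45k figure presumably includes anyone who has driven occasionally, not full-time drivers.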

amelius1 day ago

One problem: all this energy is eventually turned into heat ...

mg24 hours ago

Most of the sunlight that hits a roof is already turned into heat. Whether you use that for calculations or not does not make a difference.

Not sure about the exact numbers, but I'd guess that at the moment normal roofs and solar panels absorb roughly the same percentage of sunlight.

So if in the future solar panels become more efficient, then yes, the amount of sunlight turned into heat could double.

Maybe that can be offset by covering other parts of earth with reflective materials or finding a way to send the heat back into the universe more effectively.

amelius22 hours ago

What if you put a solar farm in a desert, though?

And also, people should paint their roofs white.

carunenjoyerlp4 hours ago

>And also, people should paint their roofs white.

Some could, but a dark roof can be beneficial in the winter

briandear23 hours ago

Why not nuclear?

natmaka6 hours ago

Because a mix of renewables deployed across a continent (most grids nowadays tend to span whole continents, since that enhances continuity of service while offering many ways to optimize for emissions, costs...) is better: cheaper, less dependency on any fuel, less risk related to accidents, neglect, mistakes, war, terrorism, hot waste...

GAFAM nuclear projects are mere announcements, intentions.

On the other front, most are already making progress: https://www.reuters.com/sustainability/climate-energy/micros...

mg23 hours ago

Building and running a nuclear reactor involves a lot of physical activity. And if the past is an indicator, we always move from physical activity to the flow of electrons.

The discussion about nuclear vs solar reminds me of the discussions about spinning HDDs versus solid-state drives when the latter were new.

carunenjoyerlp22 hours ago

HDDs form the backbone of all large storage systems; they serve many purposes today. Magnetic tape is still in use too.

Quenby9 hours ago

With today’s AI systems, we still have very little visibility into their actual energy costs. As we push for larger models and faster responses, it’s worth asking whether we’re unintentionally accelerating energy use and emissions.

Finding the balance between innovation and long-term responsibility feels more important than ever.

jeffbee11 hours ago

This series of articles is driving me insane. The authors or editors are using inappropriate units to shock readers: billions of gallons, millions of square feet. But they are not putting the figures into context that the reader can directly comprehend. Because if they said the Nevada data centers would use 2% as much water as the hay/alfalfa industry in Nevada then the entire article loses its shock value.

giantg222 hours ago

With all the issues and inefficiencies listed, there is a lot of room for improvement. I'm hopeful that just as data center energy use didn't rise from 2005-2017 (a stat they cite), so too will AI energy needs flatten in a few years. GPUs are not very efficient; switching to more task-specific hardware will eventually provide more efficiency. This is already happening a little with things like TPUs.

mNovak9 hours ago

This gives me the silly idea to go try to measure the power consumption of the local data center by measuring the magnetic field coming off the utility lines.

scottcha9 hours ago

Shameless plug . . . I run a startup that is working to help with this: https://neuralwatt.com We are starting with an OS-level component (as in: no model changes or developer changes required) that uses RL to run AI with a ~25% energy-efficiency improvement without sacrificing UX. Feel free to DM me if you're interested in chatting about problems you face with energy and AI, or if you'd like to learn more.

emushack22 hours ago

I would like to see more data centers make use of large-scale Oil Immersion-Cooling. I feel like the fresh water use for cooling is a huge issue.

https://par.nsf.gov/servlets/purl/10101126

lxgr14 hours ago

There are various data center cooling techniques, and not all of them use water. As a result, water usage, and water usage efficiency, vary wildly:

https://dgtlinfra.com/data-center-water-usage/

https://datacenters.microsoft.com/sustainability/efficiency/

carunenjoyerlp22 hours ago

Isn't water just the transfer medium between the server and the heat exchangers outside? How would changing that to oil help?

chneu17 hours ago

It wouldn't really help.

Oil might be able to carry more heat but it's more expensive to use.

Oil immersion is something nerds like to think is amazing but it's just a pain in the ass for negligible benefits. Imagine the annoyance of doing maintenance.

asdff14 hours ago

Wouldn't it be no different, except your hands get a little oily? Say you take out a RAM stick: oil goes into the empty DIMM slot, but so what, because it's displaced again when you put in the new RAM stick.

carunenjoyerlp22 hours ago

>you might think it’s like measuring a car’s fuel economy or a dishwasher’s energy rating: a knowable value with a shared methodology for calculating it. You’d be wrong.

But everyone knows fuel economy is anything but a knowable value. Everything from whether it has rained in the past four hours, to temperature, to vehicle loading, to the chemical composition of the fuel (HVO vs traditional). How worn are your tires? Are they installed the right way? Are your brakes dragging? The possibilities are endless. You could end up with twice the consumption.

By the way, copy-pasting from the website is terrible on desktop firefox, the site just lags every second, for a second.

GuinansEyebrows13 hours ago

fuel economy, like blood glucose levels, is impacted by many factors, but you can measure it over time. you might not be able to prescribe a course of action but you can make corrections to the course you're already on.

est3123 hours ago

I wonder how the energy requirements are distributed between training and inference. Training should be extremely flexible, so one could train only when the sun shines and there is surplus solar power, or only when the wind turbines turn.

jnsaff223 hours ago

AFAICT the energy cost of training is still fairly low compared to the cost of the GPUs themselves, so especially during a land grab it's important to drive as close as possible to full utilization of the GPUs, energy be damned.

I doubt this is going to change.

That said, the flip side of energy cost being not a big factor is that you could probably eat the increase of energy cost by a factor of say 2 and this could possibly enable installation of short term (say 12h) battery storage to enable you to use only intermittent clean energy AND drive 100% utilization.

blkhawk13 hours ago

The numbers in the article are all over the place. I mean, the article seems to try, and some of the more general calculations should work out on paper, but the image-gen ones especially I can sort of disprove with my own experience in local generation.

Even where it matches sorta (the 400-feet e-bike thing), that only works out for me because I use an AMD card. An NVIDIA card can have several times the generation speed at the same power draw, so it all falls down again.

And the parameter they tried to standardize their figures with (the 1024x1024 thing) is also a bit meh, because the SAME number of pixels at a different aspect ratio can have huge variations in gen speed and thus power usage. For instance, for most Illustrious-type checkpoints the speed is about 60% higher at aspect ratios other than 1024x1024. It's all a bit of a mess.

phillipcarter14 hours ago

Well, this was disappointing:

> There is a significant caveat to this math. These numbers cannot serve as a proxy for how much energy is required to power something like ChatGPT 4o.

Otherwise this is an excellent article critiquing the very real problem that is opacity of these companies regarding model sizes and deployments. Not having an honest accounting of computing deployed worldwide is a problem, and while it's true that we didn't really do this in the past (early versions of Google searches were undoubtedly inefficient!), it's not an excuse today.

I also wish this article talked about the compute trends. That is, compute per token is going significantly down, but that also means use of that compute can spread more. Where does that lead us?

acidburnNSA14 hours ago

I'm glad that the needs of AI and the sustainable capabilities of nuclear fission go well together.

briandear23 hours ago

What’s the net energy footprint of an employee working in an office whose job was made redundant by AI? Of course that human will likely have another job, but what’s the math of a person who was doing tedium solved by AI and now can do something more productive that AI can’t necessarily do. In other words, let’s calculate the “economic output per energy unit expended.”

On that note, what’s the energy footprint of the return to office initiatives that many companies have initiated?

lm2846921 hours ago

> a person who was doing tedium solved by AI and now can do something more productive that AI can’t necessarily do.

Like driving an Uber or delivering food on a bicycle? Amazing!

folkrav23 hours ago

> Of course that human will likely have another job, but what’s the math of a person who was doing tedium solved by AI and now can do something more productive that AI can’t necessarily do

That’s a lot of big assumptions - that the job getting replaced was tedious in the first place, that those other “more productive” job exists, that the thing AI can’t necessarily do will stay that way long enough for it not to be taken over by AI as well, that the tediousness was not part of the point (e.g. art)…

Scarblac23 hours ago

When human civilization crashes due to yearly climate change caused famines it won't matter how useful the work done by the AIs was.

carunenjoyerlp22 hours ago

Net energy change of people doing work at their desk versus browsing the internet versus playing games: you will likely not see a difference at all. They're all at rest, more or less thinking about something. People on the home sofa have metabolic processes running anyway, regardless of whether it produces additional value for some corporation.

protocolture10 hours ago

Weird, I was assured that Bitcoin would be using all of the world's electricity by now.

Which I already thought was odd, because London would need all that electricity to see through the giant mountain of poop piled up by all the horses the British use for transportation.

LordDragonfang4 hours ago

> The new model uses [energy] equivalent to riding 38 miles on an e-bike... AI companies have defended these numbers saying that generative video has a smaller footprint than the film shoots and travel that go into typical video production. That claim is hard to test and doesn’t account for the surge in video generation that might follow if AI videos become cheap to produce.

"Hard to test", but very obviously true if you make any attempt at guessing based on making a few assumptions... like they seem comfortable doing for all the closed source models they don't have access to being run in conditions they're not testing for. Especially considering they're presenting their numbers as definitive, and then just a couple paragraphs down admit that, yeah, they're just guessing.

Regardless, I know for a fact that a typical commercial shoot uses way more energy than riding across the TMZ on an e-bike (considering they're definitely using cars to transport gear, which gets you less than 4 miles on the same energy).
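
A back-of-envelope version of that comparison, with assumed efficiency figures (not from the article): an e-bike uses roughly 15 Wh per mile, while an electric car uses roughly 300 Wh per mile (a gasoline car is several times worse again).

```python
# How far does a car go on the energy budget of 38 e-bike miles?
# Both efficiency figures are rough assumptions, not from the article.
ebike_wh_per_mile = 15    # typical e-bike consumption
ev_wh_per_mile = 300      # typical electric car consumption

video_energy_wh = 38 * ebike_wh_per_mile       # 570 Wh implied by the article
car_miles = video_energy_wh / ev_wh_per_mile
print(car_miles)  # 1.9 miles on the same energy
```

Consistent with the "less than 4 miles" figure under these assumptions.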

fred6913 hours ago

Might have missed it but was disappointed to see no mention of externalized costs like the scraping burden imposed on every IP-connected server. From discussions on HN this sounds quite substantial. And again, why exactly should the few AI companies reap all the value when other companies and individuals are incurring costs for it?

djoldman12 hours ago

> This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI’s future impact on energy grids and emissions. Worse, the deals that utility companies make with the data centers will likely transfer the costs of the AI revolution to the rest of us, in the form of higher electricity bills.

... So don't? Explicitly shift the cost to the customer.

If I want to hook up to the energy grid with 3-phase power, I pay the utility to do it.

If a business wants more power and it isn't available, then the business can pay for it.

Then only businesses that really need it will be willing to step up to the plate.

No amount of "accounting" or "energy needs prediction" will guard against regulatory capture.

mark_l_watson23 hours ago

The book “AI Atlas” covers the energy and other costs of AI.

simonw13 hours ago

You mean this one? https://en.m.wikipedia.org/wiki/Atlas_of_AI

It's from 2021 so won't cover the 2022-onwards generative AI boom.

From the Wikipedia summary it sounds like it's about machine learning algorithms like classification, AlphaGo and concerns about ethics of training and bias.

emushack22 hours ago

Link?

paulcole7 hours ago

Are these the same people who claimed that crypto was going to use more energy than the entire world by 2030?

kleiba2 hours ago

This reminds me of the quote: "Félix L'Herbier learns that there are more links in his brain than atoms in the universe."

tempfile2 hours ago

Jesus, who writes this stuff?

> AI is unavoidable

> We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode

This is surely meant to be an objective assessment, not a fluff piece.

bbor13 hours ago

Interesting, thanks for sharing! I share some concerns others have about this piece, but I’m most shocked about their finding that image generation is cheaper than text. As someone who’s gone down this rabbit hole multiple times, this runs against every single paper I’ve ever cited on the topic. Anyone know why? Maybe this is a recent change? It also doesn’t help that multimodal transformers are now blurring the lines between image and text, of course… this article doesn’t even handle that though, treating all image models as diffusion models.

Lerc21 hours ago

The point that stood out to me as concerning was

"The carbon intensity of electricity used by data centers was 48% higher than the US average."

I'd be fine with as many data centers as they want if they stimulated production of clean energy to run them.

But that quote links to another article by the same author. Which says

"Notably, the sources for all this power are particularly “dirty.” Since so many data centers are located in coal-producing regions, like Virginia, the “carbon intensity” of the energy they use is 48% higher than the national average. The paper, which was published on arXiv and has not yet been peer-reviewed, found that 95% of data centers in the US are built in places with sources of electricity that are dirtier than the national average. "

Which in turn links to https://arxiv.org/abs/2411.09786

Which puts the bulk of that 48% higher claim on

"The average carbon intensity of the US data centers in our study (weighted by the energy they consumed) was 548 grams of CO2e per kilowatt hour (kWh), approximately 48% higher than the US national average of 369 gCO2e / kWh (26)."

Which points to https://ourworldindata.org/grapher/carbon-intensity-electric...

For the average of 369 g/kWh. That's close enough to the figure in the table at https://www.epa.gov/system/files/documents/2024-01/egrid2022...

which shows 375 g/kWh (after converting from lb/MWh).

But the table they compare against shows:

    VA 576 g/kWh
    TX 509 g/kWh
    CA 374 g/kWh

and the EPA table shows:

    VA 268 g/kWh
    TX 372 g/kWh
    CA 207 g/kWh

which seem more likely to be true. The paper has California at only marginally better than the national average for renewables (which I guess they needed to support their argument, given the number of data centers there).
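
One easy place for discrepancies like this to creep in is unit conversion: eGRID reports lb/MWh while the paper uses g/kWh. A quick sketch of the conversion (the 826 lb/MWh input is back-derived from the 375 g/kWh figure quoted above, for illustration):

```python
# Convert EPA eGRID emission rates (lb CO2e/MWh) to the paper's g CO2e/kWh.
LB_TO_G = 453.592  # grams per pound

def lb_per_mwh_to_g_per_kwh(lb_per_mwh: float) -> float:
    # 1 MWh = 1000 kWh, so convert lb -> g, then divide by 1000
    return lb_per_mwh * LB_TO_G / 1000

# A national average of roughly 826 lb/MWh corresponds to ~375 g/kWh,
# matching the figure derived from the eGRID table.
print(round(lb_per_mwh_to_g_per_kwh(826.1)))  # 375
```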

I like arXiv. It's a great place to see new ideas; the fields I look at have things I can test myself to see if an idea actually works. But I would not recommend it as a source of truth. Peer review still has a place.

If they were gathering emissions data from the states themselves, they should have calculated the average from that data, not pulled the average from another, potentially completely different measure. Then their conclusions would have been valid regardless of what weird scaling factor they brought into their state calculations. The numbers might have been wrong, but the proportion would have been accurate, and it is the proportion that is being highlighted.

GuinansEyebrows13 hours ago

there are still negative externalities to heavy renewable-energy usage (heat, and water usage, which itself requires energy to purify once returned to the sewer, plus the environmental impact of building an enormous heat island in places where there was little industry previously).

cco11 hours ago

Today Google launched a model, Gemma 3n, that performs about as well as SOTA models from 1-2 years ago and runs locally on a cell phone.

Training SOTA models will, like steel mills or other large industrial projects, carry a large environmental footprint. But my prediction is that over time the vast majority of use cases will essentially run on-device and be basically zero impact, in both monetary cost and environmental terms.