Framing it in gigawatts is very interesting given the controversy about skyrocketing electric prices for residential and small-business users over the past three years, driven primarily by datacenter growth for AI. If, as another commenter notes, this 10GW is how much Chicago and NYC use combined, then we need to have a serious discussion about where this power is going to come from, given the dismal state of the USA's power grid and related infrastructure. Costs have already been shifted onto residential users in order to guarantee electric supply to the biggest datacenters, so they can keep paying peanuts for electricity while avoiding any of the infrastructural burden of maintaining or improving the grid and plants their massive power needs require.
I'm not even anti-datacenter (wouldn't be here if I were), I just think there needs to be serious rebalancing of these costs because this increase in US residential electric prices in just five years (from 13¢ to 19¢, a ridiculous 46% increase) is neither fair nor sustainable.
So where is this 10GW electric supply going to come from and who is going to pay for it?
Source: https://fred.stlouisfed.org/series/APU000072610
EDIT:
To everyone arguing this is how DCs are normally sized: yes, but normally it's not the company providing the compute for the DC owner that gives these numbers. Nvidia doesn't sell empty datacenters with power distribution networks, cooling, and little else; Nvidia sells the GPUs that will stock that DC. This isn't a typical PR newswire bulletin ("OpenAI announces new 10GW datacenter"); this is "Nvidia is providing xx compute for OpenAI". Anyway, all this is a digression from the question of power supply, consumption, grid expansion/stability, and who is paying for all that.
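The price increase cited above is easy to sanity-check; a minimal sketch using the thread's own rough figures:

```python
# Quick check of the residential price increase cited above
# (US average residential electricity from the FRED series, rough figures).
old_price = 13.0  # cents/kWh, roughly five years ago
new_price = 19.0  # cents/kWh, recent

increase = (new_price - old_price) / old_price
print(f"Increase: {increase:.0%}")  # -> Increase: 46%
```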
I work in the datacenter space. The power consumption of a data center is the "canonical" way to describe their size.
Almost every component in a datacenter is upgradeable—in fact, the compute itself only has a lifespan of ~5 years—but the power requirements are basically locked-in. A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
The fact that we use this unit really nails the fact that AI is basically refining energy.
A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
This underscores how important TSMC's upcoming N2 node is. It only increases chip density by ~1.15x (very small relative to previous node advancements), but it uses 36% less energy at the same speed as N3, or runs 18% faster than N3 at the same energy. It's coming at the right time for AI chips used by consumers and energy-starved data centers. N2 is shaping up to be TSMC's most important node since N7.
> N2 is shaping up to be TSMC's most important node since N7
Is it?
N2's energy & perf improvements seem on par with any generational node update:
        N2:N3  N3:N5  N5:N7
Power   ~30%   ~30%   ~30%
Perf    ~15%   ~15%   ~15%
https://www.tomshardware.com/news/tsmc-reveals-2nm-fabricati...
Yes. It has more tape-outs at this stage of development than either N5 or N3 did. It's wildly popular with chip designers, it seems.
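The per-step gains in the table above compound across node transitions; a rough sketch assuming the ~30%/~15% figures hold at each step:

```python
# Compounding the rough per-step gains in the table above across
# the three node transitions N7 -> N5 -> N3 -> N2.
steps = 3
power_at_same_perf = 0.70 ** steps  # ~30% power reduction per step
perf_at_same_power = 1.15 ** steps  # ~15% perf gain per step

print(f"power: {power_at_same_perf:.2f}x, perf: {perf_at_same_power:.2f}x")
# -> power: 0.34x, perf: 1.52x
```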
I thought Apple gets exclusive access to the latest node for the first 1-2 years. Is that not the case?
No. That's not the case. Maybe for a few months only.
Correct me if I'm wrong but didn't TSMC launch N3 in 2022, and still only Apple uses this latest/smallest node.
Both AMD and NVIDIA are using N4.
Apple, Mediatek, Qualcomm, Intel
I love that term "refining energy". We need to plan for massive growth in electricity production to have the supply to refine.
Sounds smart but it’s abusing the semantics of “refine” and is therefore ultimately vacuous.
I think it is really just the difference between chemically refining something and electrically refining something.
Raw AC comes in, then gets stepped down, filtered, converted into DC rails, gated, timed, and pulsed. That’s already an industrial refinement process. The "crude" incoming power is shaped into the precise, stable forms that CPUs, GPUs, RAM, storage, and networking can actually use.
Then those stable voltages get flipped billions of times per second into ordered states, which become instructions, models, inferences, and other high-value "product."
It sure seems like series of processes for refining something.
It is the opposite of refining energy. Electrical energy is steak, what leaves the datacenter is heat, the lowest form of energy that we might still have a use for in that concentration (but most likely we are just dumping it in the atmosphere).
Refining is taking a lower quality energy source and turning it into a higher quality one.
What you could argue is that it adds value to bits. But the bits themselves, their state is what matters, not the energy that transports them.
I think you're pushing the metaphor a bit far, but the parallel was to something like ore.
A power plant "mines" electrons, which the data center then refines into words, or whatever. The point is that energy is the raw material that flows into data centers.
Maybe more like converting energy to data, as a more specific type of refinement.
Using energy to decrease the entropy of data. Or to organize and structure data.
I like that. Take random wild electrons and put them neatly into rows & columns where they can sit a spell.
This is OpenAI, they are not decreasing the entropy. This is refining coal into waste heat and CO2.
All life is basically refining energy - standing up to entropy and temporarily winning the fight.
It's all about putting the entropy somewhere else and keeping your own little area organised.
People of the earth, remember: unnecessary arm and leg movements increase the entropy! Fear of the heat death of the universe! Lie down when possible!
Yes, in a very local context it appears so, but net entropy across the system from life's activities is increased
"the purpose of life is to hydrogenate carbon dioxide"
-- Michael Russel
Where do the cards go after 5 years? I don't see a large surplus of mid-sized cloud providers coming to buy them (because AI isn't profitable). Maybe other countries (possibly illegally)? Flood the consumer market with cards consumers can't use? TSMC has more than doubled its packaging capacity and is planning on doubling it again.
This.
A local to me ~40W datacenter used to be in really high demand, and despite having excess rack space, had no excess power. It was crazy.
Yeah, it was the company's pilot site, and everything about it is tiny.
But it very quickly became the best place in town for carrier interconnection. So every carrier wanted in.
Even when bigger local DCs went in, a lot of what they were doing was just landing virtual cross-connects to the tiny one, because that's where everyone was.
You lost a M or K next to your W.
I still have an Edison bulb that consumes more power.
Yep I see that haha.
> the power requirements are basically locked-in
Why is that? To do with the incoming power feed or something else?
Basically, yes. When you stand up something that big, you need to work with the local utilities to ensure they have the capacity for what you're doing. While you can ask for more power later on, if the utilities can't supply it or the grid can't transport it, you're SOL.
You could in theory supplement it with rooftop solar and batteries, especially if you can get customers who can curtail their energy use easily. Datacentres have a lot of roof space, they could at least reduce their daytime energy costs a bit. I wonder why you don't see many doing solar, do the economics not work out yet?
I'd have to do the math, but I doubt that makes sense given the amount of power these things are drawing. I've heard of DCs having on-site power generation, but it's usually in the form of diesel generators used for supplemental or emergency power. In one weird case, I heard about a DC that used on-site diesel as primary power and used the grid as backup.
Compared to their consumption, they absolutely do not: you get about ~1 kW/m^2 of solar. Some quick googling suggests a typical DC workload draws about 50 kW/m^2, rising to 100 for AI workloads.
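Put those two figures side by side and the rooftop contribution is tiny; a sketch using the parent's own numbers (note the 1 kW/m^2 solar figure is generous, as that's roughly peak irradiance rather than panel output):

```python
# How far one square metre of roof solar goes against one square metre
# of DC load, using the parent's figures (the 1 kW/m^2 solar number is
# generous -- that's roughly peak irradiance; panels deliver far less).
solar_kw_per_m2 = 1.0
dc_load_kw_per_m2 = 50.0  # typical; ~100 for AI workloads

fraction = solar_kw_per_m2 / dc_load_kw_per_m2
print(f"Roof solar covers at most {fraction:.0%} of the load")  # -> 2%
```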
Cooling too. A datacenter that takes 200MW in has to dissipate 200MW of heat to somewhere.
guessing massive capital outlays and maybe irreversible site selection/preparation concerns.
That's pretty interesting. Is it just because the power channels are the most fundamental aspect of the building? I'm sorta surprised you can't rip out old cables and drop in new ones, or something to that effect, but I also know NOTHING about electricity.
Not an expert, but it’s probably related to cooling. Every joule of that electricity that goes in must also leave the datacenter as heat. And the whole design of a datacenter is centered around cooling requirements.
Exactly. To add to that, I'd like to point out that when this person says every joule, he is not exaggerating (only a teeny tiny bit). The actual computation itself barely uses any energy at all.
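A quick sketch of what "every joule leaves as heat" implies for the cooling plant, assuming water cooling and an assumed 10 K coolant temperature rise:

```python
# Water flow needed to carry 200 MW of heat away at an assumed
# 10 K coolant temperature rise.
heat_load_w = 200e6   # W, the 200 MW datacenter discussed above
cp_water = 4186.0     # J/(kg*K), specific heat of water
delta_t_k = 10.0      # K, assumed temperature rise

flow_kg_per_s = heat_load_w / (cp_water * delta_t_k)
print(f"{flow_kg_per_s:,.0f} kg/s of water")  # roughly 4,800 kg/s
```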
Refining it into what? Stock prices?
DC infra is always allocated in terms of watts. From this number, everything else is extrapolated (e.g. rough IT load, cooling needed, etc).
> is neither fair nor sustainable
That's half what I pay in Italy, I'm sure the richest country in the world will do fine.
>I'm sure the richest country in the world will do fine.
You underestimate how addicted the US is to cheap energy and how wasteful it is at the same time.
Remember how your lifestyle always expands to fill the available resources no matter how good you have it? Well if tomorrow they'd have to pay EU prices, the country would have a war.
When you lived your entire life not caring about the energy bill or about saving energy, it's crippling to suddenly have scale back and be frugal even if that price would still be less than what other countries pay.
It's hard to appreciate the difference in 'abundance mentality' between the median US and EU person. It always struck me as an interesting culture difference. While both EU and US grew in prosperity post WWII, I feel the US narrative was quite on another level.
Here in Belgium a stupid amount of that bill is hidden taxes. i kind of assume it's similar in Italy.
We import most of our energy, that's really it.
And the substantial increase in profits for all providers, which isn't comparable to that of our neighbours. Our disposable income in Belgium really exists to subsidise energy companies, supermarkets, and a pathetic housing market.
> So where is this 10GW electric supply going to come from and who is going to pay for it?
I would also like to know. It's a LOT of power to supply. Nvidia does have a ~3% stake in Applied Digital, a bitcoin miner that pivoted to AI (also a "Preferred NVIDIA Cloud Partner") with facilities in North Dakota. So they might be involved for a fraction of those 10GW, but it seems like it will be a small fraction even with all the planned expansions.
https://www.investopedia.com/applied-digital-stock-soars-on-...
https://ir.applieddigital.com/news-events/press-releases/det...
> Framing it in gigawatts is very interesting given the controversy
Exactly. When I saw the headline I assumed it would contain some sort of ambitious green energy build-out, or at least a commitment to acquire X% of the energy from renewable sources. That's the only reason I can think to brag about energy consumption
Or this brings power and prestige to the country that hosts it. And it gives clout precisely because it is seemingly wasteful. Finding the energy is a problem for the civilian government who either go "drill baby drill" or throw wind/solar/nuclear at the problem.
Datacenters need to provide their own power/storage, and connect to the grid just to trade excess energy or provide grid stability. Given the 5-7 year backlog of photovoltaic projects waiting for interconnect, the grid is kind of a dinosaur that needs to be routed around
“ skyrocketing electric prices for residential and small business users as a result of datacenters over the past three years”
This is probably naïve. Prices skyrocketed in Germany for similar reasons before AI data centers were a thing.
Watt is the hottest new currency in big tech. Want to launch something big? You don't have to ask for dollars or headcount or servers or whatever else used to be the bottleneck in the past. There's plenty of all this to go around (and if not it can be easily bought). Success or failure now depends on whether you can beg and plead your way to getting a large enough kilowatt/megawatt allocation over every other team that's fighting for it. Everything is measured this way.
Explains why Meta is entering power trading space
https://subscriber.politicopro.com/article/eenews/2025/09/22...
That gives me Enron vibes, even though these are vastly different situations. But the idea of a social media company trading in this space is nuts.
I had my highest power bill last month in 4 years, in a month that was unseasonably cool so no AC for most of the month. Why are we as citizens without equity in these businesses subsidizing the capital class?
To me, the question is less about “how do we make more energy” and more about “how do we make LLMs 100x more energy efficient.” Not saying this is an easy problem to solve, but it all seems like a stinky code smell.
I'm pretty confident that if LLMs were made 100x more energy efficient, we would just build bigger LLMs or run more parallel inference. OpenAI's GPT-5 Pro could become the baseline, and their crazy expensive IMO model could become the Pro offering. Especially if that energy efficiency came with speedups as well (I would be surprised if it didn't). The demand for smarter models seems very strong.
This feels like a return to the moment just before Deepseek when the market was feeling all fat and confident that "more GPUs == MOAR AI". They don't understand the science, so they really want a simple figure to point to that means "this is the winner".
Framing it in GW is just giving them what they want, even if it makes no sense.
An 8% increase y/o/y is quite substantial; however, keep in mind that globally we experienced the 2022 fuel shock. In Australia, for example, we saw energy prices double that year.
Although wholesale electricity prices show double-digit average year-on-year swings, their true long-run growth is closer to ~6% per year, slightly above wages at ~4% during the same period.
So power has become somewhat less affordable, but still remains a small share of household income. In other words, wage growth has absorbed much of the real impact, and power prices are still a fraction of household income.
You can make it sound shocking with statements like “In 1999, a household’s wholesale power cost was about $150 a year, in 2022, that same household would be charged more than $1,000, even as wages only grew 2.5x”, but the real impact (on average, obviously there are outliers and low income households are disproportionately impacted in areas where gov doesn’t subsidise) isn’t major.
https://www.aer.gov.au/industry/registers/charts/annual-volu...
I wouldn’t call a $100-270 electric bill a “fraction” when it’s about 5% post tax income. I use a single light on a timer and have a small apartment
Especially since these sorts of corporations can get tax breaks or other means of getting regulators to allow spreading the cost. Residential customers shouldn't see any increase due to data centers, but they do, and will, subsidize them while seeing minimal changes to infrastructure.
When people are being told to minimize air conditioning, but these big datacenters get built and aren't told to "reduce your consumption", then it doesn't matter how big or small the electric bill is: it's subsidizing a multi-billion-dollar corporation's toy.
6% YoY is much higher than the 2-3% inflation target
So a 6.6x increase in power bill, offset by a 2.5x wage increase has no major impact?
I'm sure none of the other outgoings for a household saw similar increases. /s
0,19 per kWh. Damn man, here it is like 0,97 per kWh (Western Europe) … stop complaining
Regulated price in France:
- 0,1952 per kWh for uniform price.
- 0,1635 / 0,2081 for day/night pricing
- 0,1232 /... / 0,6468 for variable pricing
https://particulier.edf.fr/content/dam/2-Actifs/Documents/Of...
You have a very bad deal if you pay 0.97€ per kWh.
This is not true. The average in the EU is 0,287 €/kWh. I pay 0,34 €/kWh in Berlin.
And in Germany the price includes transmission and taxes, it's the consumer end price. You have to remember that some countries report electricity price without transmission or taxes, also in consumer context, so you need to be careful with comparisons.
DCs need to align their training cycles with the peak of renewable power generation
They are starting to include batteries so they dont have to adjust to external factors
Utilities always need to justify rate increases with the regulator.
The bulk of cost increases come from the transition to renewable energy. You can check your local utility and see.
It’s very easy to make a huge customer like a data center directly pay the cost needed to serve them from the grid.
Generation of electricity is more complicated; the data centers pulling cheap power from Columbia River hydro are starting to compete with residential users.
Generation is a tiny fraction of electricity charges though.
I actually don’t like this measurement, as it’s vague and dilutes the announcement. Each product delivers a different amount of compute per watt.
Imagine Ford announced “a strategic partnership with FedEx to deploy 10 giga-gallons of ICE vehicles”
It’s a sticky metric though because Moores law per power consumption died years ago.
Prices of _everything_ went up over the past five years. Datacenter expansion was far from the main driver. Dollars and cents aren't worth what they used to be.
Elsewhere it was mentioned that DCs pay less for electricity per Wh than residential customers. If that is the case, then it's not just about inflation, but also unfair pricing putting more of the infrastructure costs on residential customers whereas the demand increase is coming from commercial ones.
Industrial electricity consumers pay lower unit rates per kWh, but they also pay for any reactive power that they consume and then return -- residential consumers do not. As in, what industrial consumers actually pay is a unit cost per kVAh, not kWh.
This means loads with pretty abysmal power factors (like induction motors) actually end up costing the business more money than if they ran them at home (assuming the home had a sufficient supply of power).
Further, they get these lower rates in exchange for being deprioritised -- in grid instability (e.g. an ongoing frequency decline because demand outstrips available supply), they will be the first consumers to be disconnected from the grid. Rolling blackouts affecting residential consumers are the last resort.
There are two sides to this coin.
Note that I am in no way siding with this whole AI electricity consumption disaster. I can't wait for this bubble to pop so we can get back to normality. 10GW is a third of the entire daily peak demand of my country (the United Kingdom). It's ridiculous.
Edit: Practical Engineering (YouTube channel) has a pretty decent video on the subject. https://www.youtube.com/watch?v=ZwkNTwWJP5k
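The kVAh-vs-kWh point above is worth illustrating with numbers; a toy sketch (the load size and power factor are made-up example values):

```python
# Toy illustration of kVAh vs kWh billing from the comment above.
# The load size and power factor are made-up example numbers.
real_power_kw = 100.0  # kW of useful work
power_factor = 0.8     # e.g. a poorly compensated induction motor
hours = 720            # one month

kwh = real_power_kw * hours                  # residential-style billing
kvah = real_power_kw / power_factor * hours  # industrial-style billing
print(kwh, kvah)  # -> 72000.0 90000.0 (25% more billable units)
```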
I mean, gigawatts is a concise metric to get a grasp of the amount of GPU compute they install, but the honesty seems a bit strange to me.
Total gigawatts is the maximum amount of power that can be supplied from the power generating station and consumed at the DC through the infrastructure and hardware as it was built.
Whether they use all those gigawatts and what they use them for would be considered optional and variable from time to time.
> So where is this 10GW electric supply going to come from
If the US petro-regime wasn't fighting against cheap energy sources this would be a rounding error in the country's solar deployment.
China deployed 277GW of solar in 2024 and is accelerating, having deployed 212GW in the first half of 2025. 10 GW could be a pebble in the road, but instead it will be a boulder.
Voters should be livid that their power bills are going up instead of plummeting.
Fyi capacity announced is very far from the real capacity when dealing with renewables. It's like saying that you bought a Ferrari so now you can drive at 300km/h on the road all of the time.
In mid latitudes, 1 GW of solar power produces around 5.5 GWh/day. So the "real" equivalent is a 0.23 GW gas or nuclear plant (even lower when accounting for storage losses).
But "China installed 63 GW-equivalent" of solar power is a bit less interesting, so we go for the fake figures ;-)
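The nameplate-to-firm conversion above, as a sketch using the same assumed 5.5 GWh/day yield per GW:

```python
# Nameplate vs firm-equivalent capacity, using the figures above:
# 1 GW of mid-latitude solar producing ~5.5 GWh/day averages ~0.23 GW.
nameplate_gw = 277.0   # China's 2024 solar additions
gwh_per_gw_day = 5.5   # assumed mid-latitude daily yield

firm_equiv_gw = nameplate_gw * gwh_per_gw_day / 24.0
print(f"{firm_equiv_gw:.0f} GW firm-equivalent")  # -> 63 GW firm-equivalent
```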
You think they don't know that too? You can bet they're investing heavily in grid-level storage too.
I was commenting the initial number announcement. And storage at this scale right now doesn't exist. The most common way, water reservoirs, requires hard-to-find sites that are typically in the Himalaya, so far away from the production place. And the environmental cost isn't pretty either.
I'm living in one of the most expensive electricity markets in the US. It has a lot more to do with the state shutting down cheap petro energy (natural gas) and nuclear then replacing it with... tbd.
How would that solar power a DC at night or on a cloudy day? Energy storage isn’t cheap.
In 2025 it’s cheaper to demolish an operating coal plant and replace it with solar and battery, and prices are still dropping.
Why aren't all these businesses doing that then?
Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?
Given that steam turbine efficiency depends on the temperature delta between steam input and condenser, unlikely unless you're somehow going to adapt Nvidia GPUs to run with cooling loop water at 250C+.
Thermodynamics says no. In fact you have to spend energy to remove that heat from the cores.
(Things might be different if you had some sort of SiC process that let you run a GPU at 500C core temperatures, then you could start thinking of meaningful uses for that, but you'd still need a river or sea for the cool side just as you do for nuclear plants)
In the Nordics the waste heat is used for district heating. This practical heat sink really favors northern countries for datacenter builds. In addition you usually get abundant water and lower population density (meaning easier to build renewables that have excess capacity).
No.
> letter of intent for a landmark strategic partnership
> intends to invest up to xxx progressively
> preferred strategic compute and networking partner
> work together to co-optimize their roadmaps
> look forward to finalizing the details of this new phase of strategic partnership
I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".
NVDA's share price enjoyed a nice $6 bump today, so the announcement did what it was supposed to do.
In a sense, it's just an ask to public investors for added capital to do a thing, and evidently a number of investors found the pitch compelling enough.
Increase in share price doesn't provide extra cash to a company. They'd have to issue new shares for that.
It doesn't directly, but it helps because they can do deals where they buy things with stock, like people's labor or small companies, and now that "money" is more valuable.
It does help with employee stock compensation. If your stock doubled in the past year, then you just need to dole out 50% of shares as last year in equity refreshers to retain talent.
Nvidia probably has the opposite problem - employee stock has appreciated so much that you have to convince them not to retire.
Maybe but people's spending also dramatically goes up as they start making more money. You buy that $5m vacation home at Tahoe, you buy fully-loaded Rivian SUVs, you send your kids to expensive private schools, you fly only first-class on family vacations, and you are back to needing to work more to sustain this lifestyle.
This assumes your staff are not a bunch of boglehead freaks constantly on blind and crunching spreadsheets and grinding their leetcode for that perfectly timed leap.
RSU vesting is a bit like options. You have the option but not the obligation to stay in the job!
But company owns stock right? So they can sell those stocks no?
It can, but investors don't like that since it dilutes the value of their own shares. Which is why large companies usually do the opposite - share buybacks. Nvidia in fact bought $24 billion worth of its own shares in the first half of 2025, and plans to spend $60 billion more in buybacks in upcoming months.
Which investors also usually don't like. It says "we have all this cash, but we have no idea what to do with it so we are buying out own stock". While I'd expect a company to actually invest (into research, tech, growth etc.) with it's excess cash to make more money in the future.
Preferred by some to dividends.
If stock buybacks cause the price to go up like it should in theory, that's less of a tax hit than dividends! I'll take it
That has to be compared with how much stock the company is “selling”, via equity compensation to employees.
The "meaning" is clear, create FOMO among suckers.
For someone who doesn't know what a gigawatt worth of Nvidia systems is, how many high-end H100s or whatever does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.
It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.
The NVL72 is 72 chips at 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
So each 2 kW component is like a top-shelf space heater, which the smart money never did want to run unless it was quite cold outside.
It will be the world's most advanced resistor.
Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.
"GPUs per user" would be an interesting metric.
(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.
That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.
Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimates 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate break down.
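The napkin math above, spelled out (all inputs are the rough public figures quoted in the thread, so treat the outputs accordingly):

```python
# The GPUs-per-user napkin math above, spelled out.
# All inputs are the rough public figures quoted in the thread.
gpus = 1_000_000     # "well over 1 million GPUs" by year end
users = 800_000_000  # ~800M ChatGPT users
active_share = 0.05  # assume each user is active 5% of the day

people_per_gpu = users / gpus
concurrent_per_gpu = users * active_share / gpus
print(people_per_gpu, concurrent_per_gpu)  # -> 800.0 40.0
```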
A lot of GPUs are allocated for training and research, so dividing the total number by the number of users isn’t particularly useful. Doubly so if you’re trying to account for concurrency.
I'm kinda scared of "1.2 hours a day of ai use"...
Sorry, those figures are skewed by Timelord Georg, who has been using AI for 100 million hours a day, is an outlier, and should have been removed.
Roger, but I still think with that much energy at its disposal, if AI performs as desired it will work its way up to using each person more than 1.2 hours per day, without them even knowing about it :\
When GPUs share people concurrently, they collectively get much more than 24 hours of person per day.
You're right!
With that kind of singularity the man-month will no longer be mythical ;)
It will be epic!
At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also, the "infrastructure partnership" language suggests more than just compute. So I would add cooling into the equation, which could be as much as half the power load, or more, depending on where they intend to locate these datacenters.
Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...
With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
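The same napkin math for a range of per-accelerator power draws, still ignoring cooling and the other overheads noted above:

```python
# GPU count implied by the 10 GW figure at several per-GPU power draws,
# ignoring cooling and other facility overhead.
total_power_w = 10e9  # the 10 GW headline figure

for per_gpu_w in (600, 1_000, 2_000):
    millions = total_power_w / per_gpu_w / 1e6
    print(f"{per_gpu_w} W/GPU -> {millions:.1f}M GPUs")
# -> 16.7M at 600 W, 10.0M at 1 kW, 5.0M at 2 kW
```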
It's easy to see what it could be by looking at Green500.
B200 is 1kW+ TDP ;)
And consists of 8 GPUs
Account for around 3 MW for every 1,000 GPUs. So 10 GW is around 3,333 × 3 MW, i.e. roughly 3.33 million GPUs.
How much cable (and what kind) to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.
And how much is that as a percentage of Bitcoin network capacity?
Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.
To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.
Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
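That estimate as a quick calculation; the network figure is the ~1,000,000 PH/s implied by the parent's own 0.05% conclusion:

```python
# The GPU-vs-ASIC mining comparison above as a quick calculation.
# The network figure is the ~1,000,000 PH/s implied by the parent's
# 0.05% estimate, not an independently checked number.
gpus = 50_000_000     # estimated 2025 Nvidia GPU shipments
ghash_per_gpu = 10.0  # parent's rough SHA-256 estimate per GPU

gpu_phash = gpus * ghash_per_gpu / 1e6  # GH/s -> PH/s
network_phash = 1_000_000
print(f"{gpu_phash:.0f} PH/s = {gpu_phash / network_phash:.2%} of network")
# -> 500 PH/s = 0.05% of network
```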
I'm also wondering what kind of threat this could be to PoW blockchains.
What happens if AI doesn't pay off before the GPUs wear out or are in need of replacement?
So at that point a DC replaces them all with ASICs instead?
Or if they just feel like doing that any time.
Some chains are designed to be ASIC resistant.
Vera Rubin will be about 2.5 kW and Feynman will be about 4 kW.
All-in, you're looking at a higher footprint, maybe 4-5 kW per GPU blended.
So about 2 million GPUs.
You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.
I dunno.
Google is pretty useful.
It uses >15 TWh per year.
Theoretically, AI could be more useful than that.
Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.
It could be a short-term crunch to pull-forward (slightly) AI advancements.
Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.
Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.
VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near its current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).
According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's their total consumption of their datacenters, which would include everything from Google Search, to Gmail, Youtube, to every Google Cloud customer. Is it broken down by product somewhere?
30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.
Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.
[1] https://sustainability.google/reports/google-2025-environmen...
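The annual-energy-to-average-power conversion behind the comparison above:

```python
# Converting Google's reported 30 TWh/year to average power,
# to compare with the 10 GW headline figure.
twh_per_year = 30.0
hours_per_year = 8760

avg_gw = twh_per_year * 1000 / hours_per_year  # TWh -> GWh, then / hours
print(f"{avg_gw:.1f} GW average")  # -> 3.4 GW average
```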
Data centers typically use 60% (or less) on average of their max rating.
You over-provision so that you (almost) always have enough compute to meet your customers needs (even at planet scale, your demand is bursty), you're always doing maintenance on some section, spinning up new hardware and turning down old hardware.
So, apples to apples, this would likely not even be 2x at 30TWh for Google.
For other readers: "15 Twh per year" is equivalent to 1.71 GW, 17.1% of the "10GW" number used to describe the deal.
This is ignoring the utilization factor, though. Both Google and OpenAI have to overprovision servers for worst-case simultaneous usage, so 1.71 GW average doesn't tell us Google's maximum instantaneous GW capacity. If we pull a 4x out of the hat (i.e. peak usage is 4x above average), it becomes ~7 GW of available compute.
More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.
Does Google not include AI?
I mean, if 10GW of GPUs gets us AGI and we cure cancer, then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated TikTok influencers.
Current llms are just like farms. Instead of tomatoes by the pound you buy tokens by the pound. So it depends on the customers.
This is also my take. I think a lot of people miss the trees for the forest (intentionally backward).
AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.
And when it’s built, Sam Altman will say: We are so close, if we get 10TW, AGI will be here next year!
Do you think the existence of NYC and Chicago is insanely wasteful?