Comments

  • By ComputerGuru 2025-09-22 19:25 (21 replies)

    Framing it in gigawatts is very interesting given the controversy over skyrocketing electric prices for residential and small-business users over the past three years, driven primarily by datacenter growth for AI. If, as another commenter notes, this 10GW is how much Chicago and NYC use combined, then we need to have a serious discussion about where this power is going to come from, given the dismal state of the USA's power grid and related infrastructure. Costs have already been shifted onto residential users in order to guarantee electric supply to the biggest datacenters, which keep paying peanuts for electricity while shouldering none of the infrastructural burden of maintaining or improving the grid and plants their massive power needs require.

    I'm not even anti-datacenter (I wouldn't be here if I were); I just think there needs to be a serious rebalancing of these costs, because this increase in US residential electric prices in just five years (from 13¢ to 19¢, a ridiculous 46% increase) is neither fair nor sustainable.

    So where is this 10GW electric supply going to come from and who is going to pay for it?

    Source: https://fred.stlouisfed.org/series/APU000072610
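
    A quick sanity check on that 46% figure, using the cent values quoted above (a minimal sketch; the exact FRED series values vary month to month):

```python
# Sanity check: US average residential electricity price, cents/kWh,
# using the rough endpoints quoted in the comment above.
price_then = 13.0  # ~5 years ago
price_now = 19.0   # today
increase = (price_now - price_then) / price_then
print(f"{increase:.0%}")  # ~46%
```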

    EDIT:

    To everyone arguing this is how DCs are normally sized: yes, but normally it's not the company providing the compute for the DC owner that is giving these numbers. Nvidia doesn't sell empty datacenters with power distribution networks, cooling, and little else; Nvidia sells the GPUs that will stock that DC. This isn't a typical PR newswire bulletin, "OpenAI announces new 10GW datacenter"; this is "Nvidia is providing xx compute for OpenAI". Anyway, all this is an aside from the question of power supply, consumption, grid expansion/stability, and who is paying for all of that.

    • By elbasti 2025-09-22 19:52 (9 replies)

      I work in the datacenter space. The power consumption of a data center is the "canonical" way to describe their size.

      Almost every component in a datacenter is upgradeable—in fact, the compute itself only has a lifespan of ~5 years—but the power requirements are basically locked-in. A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.

      The fact that we use this unit really nails the fact that AI is basically refining energy.

      • By aurareturn 2025-09-22 20:37 (1 reply)

          A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
        
        This underscores how important TSMC's upcoming N2 node is. It only increases chip density by ~1.15x (very small relative to previous node advancements), but it uses 36% less energy than N3 at the same speed, or runs 18% faster than N3 at the same energy. It's coming at the right time for AI chips used by consumers and for energy-starved data centers.

        N2 is shaping up to be TSMC's most important node since N7.
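
        To make the trade-off concrete, here is a rough sketch of what those two quoted operating points imply for a power-capped site (the numbers are the comment's, not TSMC's exact figures):

```python
# Implications of the quoted N2-vs-N3 figures for a fixed power budget.
# Option A: run N2 at N3 speeds and bank the 36% power saving.
power_ratio_iso_speed = 1.0 - 0.36               # 0.64x power per chip
chips_per_budget = 1.0 / power_ratio_iso_speed   # ~1.56x chips in the same MW

# Option B: run N2 at the same power and take the 18% speedup per chip.
perf_ratio_iso_power = 1.18

print(round(chips_per_budget, 2), perf_ratio_iso_power)  # 1.56 1.18
```

For an energy-limited data center, option A is the interesting one: the same megawatts feed roughly half again as many chips.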

        • By alberth 2025-09-23 0:37 (1 reply)

          > N2 is shaping up to be TSMC's most important node since N7

          Is it?

          N2's energy and perf improvements seem on par with any generational node update.

                    N2:N3   N3:N5   N5:N7
            Power   ~30%    ~30%    ~30%
            Perf    ~15%    ~15%    ~15%
          
          https://www.tomshardware.com/news/tsmc-reveals-2nm-fabricati...

          • By aurareturn 2025-09-23 5:58 (1 reply)

            Yes. It has more tape-outs at this stage of development than either N5 or N3 had. It seems wildly popular with chip designers.

            • By alberth 2025-09-23 11:18 (1 reply)

              I thought Apple gets exclusive access to the latest node for the first 1-2 years. Is that not the case?

              • By aurareturn 2025-09-23 15:10 (1 reply)

                No. That's not the case. Maybe for a few months only.

                • By alberth 2025-09-23 19:36 (1 reply)

                  Correct me if I'm wrong, but didn't TSMC launch N3 in 2022? And still only Apple uses this latest/smallest node.

                  Both AMD and NVIDIA are using N4.

                  • By aurareturn 2025-09-25 3:52

                    Apple, Mediatek, Qualcomm, Intel

      • By pseudosavant 2025-09-22 20:00 (1 reply)

        I love that term "refining energy". We need to plan for massive growth in electricity production to have the supply to refine.

        • By tmalsburg2 2025-09-23 3:21 (1 reply)

          Sounds smart but it’s abusing the semantics of “refine” and is therefore ultimately vacuous.

          • By pseudosavant 2025-09-23 15:30

            I think it is really just the difference between chemically refining something and electrically refining something.

            Raw AC comes in, then gets stepped down, filtered, converted into DC rails, gated, timed, and pulsed. That’s already an industrial refinement process. The "crude" incoming power is shaped into the precise, stable forms that CPUs, GPUs, RAM, storage, and networking can actually use.

            Then those stable voltages get flipped billions of times per second into ordered states, which become instructions, models, inferences, and other high-value "product."

            It sure seems like a series of processes for refining something.

      • By jacquesm 2025-09-22 21:41 (1 reply)

        It is the opposite of refining energy. Electrical energy is steak, what leaves the datacenter is heat, the lowest form of energy that we might still have a use for in that concentration (but most likely we are just dumping it in the atmosphere).

        Refining is taking a lower quality energy source and turning it into a higher quality one.

        What you could argue is that it adds value to bits. But the bits themselves, their state is what matters, not the energy that transports them.

        • By elbasti 2025-09-22 22:06 (1 reply)

          I think you're pushing the metaphor a bit far, but the parallel was to something like ore.

          A power plant "mines" electrons, which the data center then refines into words, or whatever. The point is that energy is the raw material that flows into data centers.

          • By fuzzfactor 2025-09-22 22:50 (1 reply)

            Maybe more like converting energy to data, as a more specific type of refinement.

            • By phkahler 2025-09-23 0:19 (2 replies)

              Using energy to decrease the entropy of data. Or to organize and structure data.

              • By fuzzfactor 2025-09-23 1:58

                I like that. Take random wild electrons and put them neatly into rows & columns where they can sit a spell.

              • By LaGrange 2025-09-23 10:47

                This is OpenAI, they are not decreasing the entropy. This is refining coal into waste heat and CO2.

      • By reubenmorais 2025-09-22 20:10 (3 replies)

        All life is basically refining energy - standing up to entropy and temporarily winning the fight.

        • By HPsquared 2025-09-22 20:58 (1 reply)

          It's all about putting the entropy somewhere else and keeping your own little area organised.

          • By xnickb 2025-09-22 21:08

            People of the earth, remember: unnecessary arm and leg movements increase the entropy! Fear of the heat death of the universe! Lie down when possible!

        • By antihipocrat 2025-09-23 0:45

          Yes, in a very local context it appears so, but net entropy across the system from life's activities is increased

        • By ithkuil 2025-09-23 5:36

          "the purpose of life is to hydrogenate carbon dioxide"

          -- Michael Russel

      • By casey2 2025-09-23 10:26

        Where do the cards go after 5 years? I don't see a large surplus of mid-sized cloud providers coming to buy them (because AI isn't profitable). Maybe other countries (possibly illegally)? Flood the consumer market with cards they can't use? TSMC has more than doubled its packaging capacity and is planning on doubling it again.

      • By protocolture 2025-09-22 23:12 (1 reply)

        This.

        A local to me ~40W datacenter used to be in really high demand, and despite having excess rack space, had no excess power. It was crazy.

        • By nixass 2025-09-22 23:14 (1 reply)

          40W - is that an ant datacenter? :)

          • By protocolture 2025-09-23 0:21 (1 reply)

            Yeah, it was the company's pilot site, and everything about it is tiny.

            But it very quickly became the best place in town for carrier interconnection. So every carrier wanted in.

            Even when bigger local DCs went in, a lot of what they were doing was just landing virtual cross-connects to the tiny one, because that's where everyone was.

      • By pabs3 2025-09-23 3:43 (3 replies)

        > the power requirements are basically locked-in

        Why is that? To do with the incoming power feed or something else?

        • By brendoelfrendo 2025-09-23 4:01 (1 reply)

          Basically, yes. When you stand up something that big, you need to work with the local utilities to ensure they have the capacity for what you're doing. While you can ask for more power later on, if the utilities can't supply it or the grid can't transport it, you're SOL.

          • By pabs3 2025-09-23 8:30 (2 replies)

            You could in theory supplement it with rooftop solar and batteries, especially if you can get customers who can curtail their energy use easily. Datacentres have a lot of roof space, they could at least reduce their daytime energy costs a bit. I wonder why you don't see many doing solar, do the economics not work out yet?

            • By brendoelfrendo 2025-09-23 16:25

              I'd have to do the math, but I doubt that makes sense given the amount of power these things are drawing. I've heard of DCs having on-site power generation, but it's usually in the form of diesel generators used for supplemental or emergency power. In one weird case, I heard about a DC that used on-site diesel as primary power and used the grid as backup.

            • By XorNot 2025-09-23 9:55

              Compared to their volume they absolutely do not: you get about ~1 kW/m² of solar. Some quick googling suggests a typical DC workload is about 50 kW/m², rising to 100 for AI workloads.
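
              A back-of-envelope version of this comparison (note that the 1 kW/m² figure above is roughly peak insolation; actual panel output is closer to ~0.2 kW/m², which makes the gap even wider):

```python
# Rooftop solar output vs. data-center draw, per square meter of roof.
panel_output_kw_per_m2 = 0.2    # realistic panel output at peak sun
dc_load_kw_per_m2 = 50.0        # typical DC workload (per the comment)
ai_load_kw_per_m2 = 100.0       # AI workload (per the comment)

# Even with the roof fully covered, solar meets well under 1% of load.
typical = panel_output_kw_per_m2 / dc_load_kw_per_m2
ai = panel_output_kw_per_m2 / ai_load_kw_per_m2
print(f"{typical:.2%} {ai:.2%}")  # 0.40% 0.20%
```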

        • By jl6 2025-09-23 6:38

          Cooling too. A datacenter that takes 200MW in has to dissipate 200MW of heat to somewhere.

        • By djtriptych 2025-09-23 3:57

          guessing massive capital outlays and maybe irreversible site selection/preparation concerns.

      • By kulahan 2025-09-22 21:39 (1 reply)

        That's pretty interesting. Is it just because the power channels are the most fundamental aspect of the building? I'm sorta surprised you can't rip out old cables and drop in new ones, or something to that effect, but I also know NOTHING about electricity.

        • By libraryofbabel 2025-09-23 3:22 (1 reply)

          Not an expert, but it’s probably related to cooling. Every joule of that electricity that goes in must also leave the datacenter as heat. And the whole design of a datacenter is centered around cooling requirements.

          • By vrighter 2025-09-23 16:14

            Exactly. To add to that, I'd like to point out that when this person says every joule, he is not exaggerating (only a teeny tiny bit). The actual computation itself barely uses any energy at all.

      • By pjc50 2025-09-23 10:46

        Refining it into what? Stock prices?

    • By deelowe 2025-09-22 19:46

      DC infra is always allocated in terms of watts. From this number, everything else is extrapolated (e.g. rough IT load, cooling needed, etc).

    • By epolanski 2025-09-23 0:36 (3 replies)

      > is neither fair nor sustainable

      That's half what I pay in Italy, I'm sure the richest country in the world will do fine.

      • By FirmwareBurner 2025-09-23 7:56 (1 reply)

        >I'm sure the richest country in the world will do fine.

        You underestimate how addicted the US is to cheap energy and how wasteful it is at the same time.

        Remember how your lifestyle always expands to fill the available resources, no matter how good you have it? Well, if tomorrow they had to pay EU prices, there would be war.

        When you've lived your entire life not caring about the energy bill or about saving energy, it's crippling to suddenly have to scale back and be frugal, even if that price would still be less than what other countries pay.

        • By impjohn 2025-09-24 15:42

          It's hard to appreciate the difference in 'abundance mentality' between the median US and EU person. It always struck me as an interesting cultural difference. While both the EU and US grew in prosperity post-WWII, I feel the US narrative was on quite another level.

      • By modo_mario 2025-09-23 9:16 (2 replies)

        Here in Belgium a stupid amount of that bill is hidden taxes. I kind of assume it's similar in Italy.

        • By epolanski 2025-09-23 9:58

          We import most of our energy, that's really it.

        • By port11 2025-09-23 12:55

          And the substantial increase in profits for all providers, which isn't comparable to that of our neighbours. Our disposable income in Belgium really exists to subsidise energy companies, supermarkets, and a pathetic housing market.

    • By abstractwater 2025-09-22 21:17

      > So where is this 10GW electric supply going to come from and who is going to pay for it?

      I would also like to know. It's a LOT of power to supply. Nvidia does have a ~3% stake in Applied Digital, a bitcoin miner that pivoted to AI (also a "Preferred NVIDIA Cloud Partner") with facilities in North Dakota. So they might be involved for a fraction of those 10GW, but it seems like it will be a small fraction even with all the planned expansions.

      https://www.investopedia.com/applied-digital-stock-soars-on-...

      https://ir.applieddigital.com/news-events/press-releases/det...

    • By gitpusher 2025-09-22 20:05 (1 reply)

      > Framing it in gigawatts is very interesting given the controversy

      Exactly. When I saw the headline I assumed it would contain some sort of ambitious green energy build-out, or at least a commitment to acquire X% of the energy from renewable sources. That's the only reason I can think of to brag about energy consumption.

      • By 7952 2025-09-22 21:07

        Or this brings power and prestige to the country that hosts it. And it gives clout precisely because it is seemingly wasteful. Finding the energy is a problem for the civilian government, which either goes "drill baby drill" or throws wind/solar/nuclear at the problem.

    • By paulsutter 2025-09-22 20:59

      Datacenters need to provide their own power/storage, and connect to the grid just to trade excess energy or provide grid stability. Given the 5-7 year backlog of photovoltaic projects waiting for interconnect, the grid is kind of a dinosaur that needs to be routed around

    • By mensetmanusman 2025-09-22 20:32

      “ skyrocketing electric prices for residential and small business users as a result of datacenters over the past three years”

      This is probably naïve. Prices skyrocketed in Germany for similar reasons before AI data centers were a thing.

    • By paxys 2025-09-22 20:27 (1 reply)

      Watt is the hottest new currency in big tech. Want to launch something big? You don't have to ask for dollars or headcount or servers or whatever else used to be the bottleneck in the past. There's plenty of all this to go around (and if not it can be easily bought). Success or failure now depends on whether you can beg and plead your way to getting a large enough kilowatt/megawatt allocation over every other team that's fighting for it. Everything is measured this way.

    • By apercu 2025-09-23 13:02

      I had my highest power bill last month in 4 years, in a month that was unseasonably cool so no AC for most of the month. Why are we as citizens without equity in these businesses subsidizing the capital class?

    • By ianks 2025-09-23 1:38 (1 reply)

      To me, the question is less about “how do we make more energy” and more about “how do we make LLMs 100x more energy efficient.” Not saying this is an easy problem to solve, but it all seems like a stinky code smell.

      • By sothatsit 2025-09-23 3:12

        I'm pretty confident that if LLMs were made 100x more energy efficient, we would just build bigger LLMs or run more parallel inference. OpenAI's GPT-5 Pro could become the baseline, and their crazy expensive IMO model could become the Pro offering. Especially if that energy efficiency came with speedups as well (I would be surprised if it didn't). The demand for smarter models seems very strong.

    • By XorNot 2025-09-23 9:35

      This feels like a return to the moment just before Deepseek when the market was feeling all fat and confident that "more GPUs == MOAR AI". They don't understand the science, so they really want a simple figure to point to that means "this is the winner".

      Framing it in GW is just giving them what they want, even if it makes no sense.

    • By apimade 2025-09-22 21:51 (3 replies)

      An 8% increase y/y is quite substantial; however, keep in mind that globally we experienced the 2022 fuel shock. In Australia, for example, we saw energy prices double that year.

      Although wholesale electricity prices show double-digit average year-on-year swings, their true long-run growth is closer to ~6% per year, slightly above wages at ~4% during the same period.

      So power has become somewhat less affordable, but still remains a small share of household income. In other words, wage growth has absorbed much of the real impact, and power prices are still a fraction of household income.

      You can make it sound shocking with statements like "In 1999, a household's wholesale power cost was about $150 a year; in 2022, that same household would be charged more than $1,000, even as wages only grew 2.5x", but the real impact isn't major (on average; obviously there are outliers, and low-income households are disproportionately impacted in areas where the government doesn't subsidise).

      https://www.aer.gov.au/industry/registers/charts/annual-volu...

      • By atkailash 2025-09-22 22:38

        I wouldn’t call a $100-270 electric bill a “fraction” when it’s about 5% of post-tax income. I use a single light on a timer and have a small apartment.

        Especially since these sorts of corporations can get tax breaks or have ways of getting regulators to allow spreading the cost. Residential users shouldn’t see any increase due to data centers, but they do, and will, subsidize them while seeing minimal changes to infrastructure.

        When people are being told to minimize air conditioning, but these big datacenters get built and aren’t told “reduce your consumption”, then it doesn’t matter how big or small the electric bill is; it’s subsidizing a multi-billion-dollar corporation’s toy.

      • By bushbaba 2025-09-23 3:21

        6% YoY is much higher than the 2-3% inflation target

      • By richrichardsson 2025-09-23 6:52

        So a 6.6x increase in power bill, offset by a 2.5x wage increase has no major impact?

        I'm sure none of the other outgoings for a household saw similar increases. /s

    • By mvanbaak 2025-09-22 22:30 (3 replies)

      0,19 per kWh. Damn man, here it is like 0,97 per kWh (Western Europe) … stop complaining

      • By Rexxar 2025-09-23 0:01

        Regulated price in France:

        - 0,1952 per kWh for uniform price.

        - 0,1635 / 0,2081 for day/night pricing

        - 0,1232 /... / 0,6468 for variable pricing

        https://particulier.edf.fr/content/dam/2-Actifs/Documents/Of...

        You have a very bad deal if you pay 0.97€ per kWh.

      • By patrickmcnamara 2025-09-22 23:07 (1 reply)

        This is not true. The average in the EU is 0,287 €/kWh. I pay 0,34 €/kWh in Berlin.

        • By distances 2025-09-23 8:54

          And in Germany the price includes transmission and taxes, it's the consumer end price. You have to remember that some countries report electricity price without transmission or taxes, also in consumer context, so you need to be careful with comparisons.

    • By whatever1 2025-09-22 21:37 (1 reply)

      DCs need to align their training cycles with the peak of renewable power generation

      • By justincormack 2025-09-22 21:47

        They are starting to include batteries so they don't have to adjust to external factors

    • By cavisne 2025-09-22 22:04

      Utilities always need to justify rate increases with the regulator.

      The bulk of cost increases come from the transition to renewable energy. You can check your local utility and see.

      It’s very easy to make a huge customer like a data center directly pay the cost needed to serve them from the grid.

      Generation of electricity is more complicated; the data centers pulling cheap power from Columbia River hydro are starting to compete with residential users.

      Generation is a tiny fraction of electricity charges though.

    • By stogot 2025-09-22 19:41 (1 reply)

      I actually don’t like this measurement, as it’s vague and dilutes the announcement. Each product delivers a different amount of compute per watt.

      Imagine Ford announced “a strategic partnership with FedEx to deploy 10 giga-gallons of ICE vehicles”

      • By mensetmanusman 2025-09-22 20:32

        It’s a sticky metric though, because Moore’s law per power consumption died years ago.

    • By dantillberg 2025-09-22 21:28 (1 reply)

      Prices of _everything_ went up over the past five years. Datacenter expansion was far from the main driver. Dollars and cents aren't worth what they used to be.

      • By basilgohar 2025-09-22 21:42 (1 reply)

        Elsewhere it was mentioned that DCs pay less for electricity per Wh than residential customers. If that is the case, then it's not just about inflation, but also unfair pricing putting more of the infrastructure costs on residential customers whereas the demand increase is coming from commercial ones.

        • By aaronmdjones 2025-09-23 6:13

          Industrial electricity consumers pay lower unit rates per kWh, but they also pay for any reactive power that they consume and then return -- residential consumers do not. As in, what industrial consumers actually pay is a unit cost per kVAh, not kWh.

          This means loads with pretty abysmal power factors (like induction motors) actually end up costing the business more money than if they ran them at home (assuming the home had a sufficient supply of power).
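
          A minimal sketch of how kVAh billing penalizes a poor power factor (the rates here are made up for illustration, not real tariffs):

```python
# kWh vs kVAh billing for a load with a poor power factor.
real_power_kw = 100.0      # power actually doing work (billed as kWh)
power_factor = 0.7         # e.g. a lightly loaded induction motor
apparent_kva = real_power_kw / power_factor   # what kVAh billing sees

industrial_rate_per_kvah = 0.12   # hypothetical discounted unit rate

# Effective cost per *real* kWh rises as the power factor drops.
effective_rate_per_kwh = industrial_rate_per_kvah / power_factor
print(round(apparent_kva, 1), round(effective_rate_per_kwh, 3))  # 142.9 0.171
```

So at a 0.7 power factor, a hypothetical 12¢/kVAh rate behaves like ~17¢ per real kWh, eating most of the discount.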

          Further, they get these lower rates in exchange for being deprioritised -- in grid instability (e.g. an ongoing frequency decline because demand outstrips available supply), they will be the first consumers to be disconnected from the grid. Rolling blackouts affecting residential consumers are the last resort.

          There are two sides to this coin.

          Note that I am in no way siding with this whole AI electricity consumption disaster. I can't wait for this bubble to pop so we can get back to normality. 10GW is a third of the entire daily peak demand of my country (the United Kingdom). It's ridiculous.

          Edit: Practical Engineering (YouTube channel) has a pretty decent video on the subject. https://www.youtube.com/watch?v=ZwkNTwWJP5k

    • By randomNumber7 2025-09-22 19:42 (1 reply)

      I mean, gigawatts is a concise metric to get a grasp of the amount of GPU compute they install, but the honesty seems a bit strange to me.

      • By fuzzfactor 2025-09-22 23:02

        Total gigawatts is the maximum amount of power that can be supplied from the power generating station and consumed at the DC through the infrastructure and hardware as it was built.

        Whether they use all those gigawatts and what they use them for would be considered optional and variable from time to time.

    • By mullingitover 2025-09-22 22:36 (3 replies)

      > So where is this 10GW electric supply going to come from

      If the US petro-regime wasn't fighting against cheap energy sources this would be a rounding error in the country's solar deployment.

      China deployed 277GW of solar in 2024 and is accelerating, having deployed 212GW in the first half of 2025. 10 GW could be a pebble in the road, but instead it will be a boulder.

      Voters should be livid that their power bills are going up instead of plummeting.

      • By Saline9515 2025-09-23 7:22 (1 reply)

        FYI, announced capacity is very far from real capacity when dealing with renewables. It's like saying that because you bought a Ferrari, you can now drive at 300 km/h on the road all of the time.

        In mid latitudes, 1 GW of solar power produces around 5.5 GWh/day. So the "real" equivalent is a 0.23 GW gas or nuclear plant (even lower when accounting for storage losses).

        But "China installed 63 GW-equivalent" of solar power is a bit less interesting, so we go for the fake figures ;-)

        • By FooBarWidget 2025-09-23 9:30 (1 reply)

          You think they don't know that too? You can bet they're investing heavily in grid-level storage too.

          • By Saline9515 2025-09-23 11:31

            I was commenting on the initial number announcement. And storage at this scale doesn't exist right now. The most common form, pumped-hydro reservoirs, requires hard-to-find sites that are typically in the Himalaya, far away from the production site. And the environmental cost isn't pretty either.

      • By parineum 2025-09-23 5:53

        I'm living in one of the most expensive electricity markets in the US. It has a lot more to do with the state shutting down cheap petro energy (natural gas) and nuclear then replacing it with... tbd.

      • By bushbaba 2025-09-23 3:22 (1 reply)

        How would that solar power a DC at night or on a cloudy day? Energy storage isn’t cheap.

        • By mullingitover 2025-09-23 5:05 (1 reply)

          In 2025 it’s cheaper to demolish an operating coal plant and replace it with solar and battery, and prices are still dropping.

          • By parineum 2025-09-23 5:55

            Why aren't all these businesses doing that then?

    • By p1necone 2025-09-22 22:50 (5 replies)

      Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?

      • By quasse 2025-09-22 22:57

        Given that steam turbine efficiency depends on the temperature delta between steam input and condenser, unlikely unless you're somehow going to adapt Nvidia GPUs to run with cooling loop water at 250C+.
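
        The Carnot limit makes the point quantitative (the temperatures here are illustrative assumptions, not measured values):

```python
# Upper bound (Carnot) on converting waste heat back into work.
def carnot_limit(t_hot_c: float, t_cold_c: float) -> float:
    t_hot_k, t_cold_k = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Warm-water GPU cooling loop (~45 C) rejecting to ~25 C ambient:
low_grade = carnot_limit(45.0, 25.0)    # ~6% ceiling, before any losses
# Hypothetical 250 C coolant, as suggested above:
hot_loop = carnot_limit(250.0, 25.0)    # ~43% ceiling
print(f"{low_grade:.1%} {hot_loop:.1%}")  # 6.3% 43.0%
```

Real heat engines recover well below the Carnot ceiling, so single-digit-percent recovery from warm coolant is rarely worth the hardware.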

      • By pjc50 2025-09-23 10:48

        Thermodynamics says no. In fact you have to spend energy to remove that heat from the cores.

        (Things might be different if you had some sort of SiC process that let you run a GPU at 500C core temperatures, then you could start thinking of meaningful uses for that, but you'd still need a river or sea for the cool side just as you do for nuclear plants)

      • By distances 2025-09-23 8:52

        In the Nordics the waste heat is used for district heating. This practical heat sink really favors northern countries for datacenter builds. In addition you usually get abundant water and lower population density (meaning easier to build renewables that have excess capacity).

      • By Blackthorn 2025-09-22 22:52

        No.

  • By paxys 2025-09-22 17:40 (2 replies)

    > letter of intent for a landmark strategic partnership

    > intends to invest up to xxx progressively

    > preferred strategic compute and networking partner

    > work together to co-optimize their roadmaps

    > look forward to finalizing the details of this new phase of strategic partnership

    I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".

    • By BHSPitMonkey 2025-09-22 18:35 (1 reply)

      NVDA's share price enjoyed a nice $6 bump today, so the announcement did what it was supposed to do.

      In a sense, it's just an ask to public investors for added capital to do a thing, and evidently a number of investors found the pitch compelling enough.

      • By paxys 2025-09-22 18:46 (3 replies)

        Increase in share price doesn't provide extra cash to a company. They'd have to issue new shares for that.

        • By jedberg 2025-09-22 19:06

          It doesn't directly, but it helps because they can do deals where they buy things with stock, like people's labor or small companies, and now that "money" is more valuable.

        • By onesociety2022 2025-09-22 19:52 (2 replies)

          It does help with employee stock compensation. If your stock doubled in the past year, then you only need to dole out 50% as many shares as last year in equity refreshers to retain talent.

          • By paxys 2025-09-22 20:22 (1 reply)

            Nvidia probably has the opposite problem - employee stock has appreciated so much that you have to convince them not to retire.

            • By onesociety2022 2025-09-22 21:06

              Maybe but people's spending also dramatically goes up as they start making more money. You buy that $5m vacation home at Tahoe, you buy fully-loaded Rivian SUVs, you send your kids to expensive private schools, you fly only first-class on family vacations, and you are back to needing to work more to sustain this lifestyle.

          • By nenenejej 2025-09-22 19:55

            This assumes your staff are not a bunch of boglehead freaks constantly on blind and crunching spreadsheets and grinding their leetcode for that perfectly timed leap.

            RSU vesting is a bit like options. You have the option but not the obligation to stay in the job!

        • By hshshshshsh 2025-09-22 19:00 (1 reply)

          But the company owns stock, right? So they can sell those shares, no?

          • By paxys 2025-09-22 19:22 (2 replies)

            It can, but investors don't like that since it dilutes the value of their own shares. Which is why large companies usually do the opposite - share buybacks. Nvidia in fact bought $24 billion worth of its own shares in the first half of 2025, and plans to spend $60 billion more in buybacks in upcoming months.

            • By littlecranky67 2025-09-22 22:45 (1 reply)

              Which investors also usually don't like. It says "we have all this cash, but we have no idea what to do with it, so we are buying our own stock." I'd expect a company to actually invest its excess cash (into research, tech, growth, etc.) to make more money in the future.

              • By inemesitaffia 2025-09-23 0:52 (1 reply)

                Preferred by some to dividends.

                • By roland35 2025-09-23 1:05

                  If stock buybacks cause the price to go up like it should in theory, that's less of a tax hit than dividends! I'll take it

            • By lotsofpulp 2025-09-22 19:52

              That has to be compared with how much stock the company is “selling”, via equity compensation to employees.

    • By casey2 2025-09-23 10:36

      The "meaning" is clear, create FOMO among suckers.

  • By ddtaylor 2025-09-22 16:34 (11 replies)

    For someone who doesn't know what a gigawatt worth of Nvidia systems is, how many high-end H100s or whatever does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.

    • By kingstnap 2025-09-22 17:09 (1 reply)

      It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.

      The NVL72 is 72 chips at 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
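
      Spelling that estimate out (rack power as quoted; the cooling overhead is the rough figure above):

```python
# Per-GPU power for a GB200 NVL72 rack, and the implied GPU count for 10 GW.
gpus_per_rack = 72
rack_power_kw = 120.0       # quoted rack total
cooling_kw = 25.0           # rough cooling overhead from the comment

kw_per_gpu = (rack_power_kw + cooling_kw) / gpus_per_rack
gpus_for_10_gw = 10e6 / kw_per_gpu   # 10 GW expressed in kW
print(round(kw_per_gpu, 2), round(gpus_for_10_gw / 1e6, 1))  # 2.01 5.0
```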

      • By fuzzfactor 2025-09-22 23:18 (1 reply)

        So each 2 kW component is like a top-shelf space heater, which the smart money never did want to run unless it was quite cold outside.

        • By willis936 2025-09-233:05

          It will be the world's most advanced resistor.

    • By thrtythreeforty 2025-09-2216:391 reply

      Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.

      • By cj 2025-09-2217:132 reply

        "GPUs per user" would be an interesting metric.

        (Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.

        That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.

        Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimates 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate break down.

        • By coder543 2025-09-2220:18

          A lot of GPUs are allocated for training and research, so dividing the total number by the number of users isn’t particularly useful. Doubly so if you’re trying to account for concurrency.

        • By NooneAtAll3 2025-09-2219:141 reply

          I'm kinda scared of "1.2 hours a day of ai use"...

          • By Rudybega 2025-09-2219:511 reply

            Sorry, those figures are skewed by Timelord Georg, who has been using AI for 100 million hours a day, is an outlier, and should have been removed.

            • By fuzzfactor 2025-09-2223:141 reply

              Roger, but I still think with that much energy at its disposal, if AI performs as desired it will work its way up to using each person more than 1.2 hours per day, without them even knowing about it :\

              • By Nevermark 2025-09-233:421 reply

                When GPUs share people concurrently, they collectively get much more than 24 hours of person per day.

                • By fuzzfactor 2025-09-233:581 reply

                  You're right!

                  With that kind of singularity the man-month will no longer be mythical ;)

    • By sandworm101 2025-09-2217:33

      At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also the "infrastructure partnership " language suggest more than just compute. So I would add cooling into the equation, which could be as much a half the power load, or more depending on where they intend to locate these datacenters.

    • By skhameneh 2025-09-2216:413 reply

      Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...

      With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
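      The same napkin math, parameterized over a few per-GPU power draws (600 W is the figure above; the other values are illustrative):

```python
TOTAL_W = 10e9  # 10 GW

# GPU count at 10 GW for a range of assumed per-GPU draws (watts)
for gpu_w in (600, 1000, 2000):
    millions = TOTAL_W / gpu_w / 1e6
    print(gpu_w, "W →", round(millions, 2), "million GPUs")
```

At 600 W that's the ~16.67 million above; at a system-level 2 kW it drops to 5 million.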

    • By alphabetag675 2025-09-2217:41

      Account for around 3 MW for every 1,000 GPUs. So 10 GW / 3 MW ≈ 3,333 blocks of 1,000 GPUs each, or around 3.33M GPUs.

    • By ProofHouse 2025-09-2217:001 reply

      How much cable (and what kind) to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.

    • By iamgopal 2025-09-2216:532 reply

      and How much is that in terms of percentage of bitcoin network capacity ?

      • By mrb 2025-09-2217:16

        Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.

        To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.

        Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
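        That estimate as arithmetic (10 GH/s per GPU and 50 million GPUs sold are my guesses above; the ~1 ZH/s global hashrate is an assumed round figure consistent with the 0.05%):

```python
GHS_PER_GPU = 10           # assumed SHA-256 rate for a modern GPU, GH/s
GPUS = 50_000_000          # Nvidia's projected 2025 GPU sales
NETWORK_HS = 1e21          # ~1 ZH/s assumed global Bitcoin hashrate

combined_hs = GHS_PER_GPU * 1e9 * GPUS   # total hashes per second
share = combined_hs / NETWORK_HS

print(combined_hs / 1e15, "PH/s")   # 500.0 PH/s
print(f"{share:.2%}")               # 0.05%
```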

      • By cedws 2025-09-2217:071 reply

        I'm also wondering what kind of threat this could be to PoW blockchains.

        • By typpilol 2025-09-2221:322 reply

          Literally none at all, because of ASICs.

          • By fuzzfactor 2025-09-2223:32

            What happens if AI doesn't pay off before the GPUs wear out or are in need of replacement?

            So at that point a DC replaces them all with ASICs instead?

            Or if they just feel like doing that any time.

          • By cedws 2025-09-2311:26

            Some chains are designed to be ASIC resistant.

    • By az226 2025-09-239:50

      Vera Rubin will be about 2.5 kW and Feynman will be about 4 kW.

      All-in, you're looking at a higher footprint, maybe 4-5 kW per GPU blended.

      So about 2 million GPUs.

    • By awertjlkjl 2025-09-2217:384 reply

      You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.

      • By onlyrealcuzzo 2025-09-2217:434 reply

        I dunno.

        Google is pretty useful.

        It uses >15 TWh per year.

        Theoretically, AI could be more useful than that.

        Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

        It could be a short-term crunch to pull-forward (slightly) AI advancements.

        Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

        Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

        VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near it's current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).

        • By dns_snek 2025-09-2218:001 reply

          According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's their total consumption of their datacenters, which would include everything from Google Search, to Gmail, Youtube, to every Google Cloud customer. Is it broken down by product somewhere?

          30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.

          Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.

          [1] https://sustainability.google/reports/google-2025-environmen...
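          The conversion used here (annual energy in TWh to average power in GW) is just division by the hours in a year:

```python
HOURS_PER_YEAR = 8760

def twh_to_avg_gw(twh: float) -> float:
    # 1 TWh = 1e12 Wh; average watts = Wh / hours; GW = W / 1e9
    return twh * 1e12 / HOURS_PER_YEAR / 1e9

print(round(twh_to_avg_gw(30), 1))   # ~3.4 GW (Google's 2024 datacenter total)
print(round(twh_to_avg_gw(15), 2))   # ~1.71 GW
```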

          • By onlyrealcuzzo 2025-09-2312:32

            Data centers typically use 60% (or less) on average of their max rating.

            You over-provision so that you (almost) always have enough compute to meet your customers needs (even at planet scale, your demand is bursty), you're always doing maintenance on some section, spinning up new hardware and turning down old hardware.

            So, apples to apples, this would likely not even be 2x at 30 TWh for Google.

        • By tmiku 2025-09-2218:011 reply

          For other readers: "15 TWh per year" is equivalent to an average of 1.71 GW, 17.1% of the "10 GW" number used to describe the deal.

          • By mNovak 2025-09-2219:13

            This is ignoring the utilization factor though. Both Google and OpenAI have to overprovision servers for the worst case of simultaneous users. So 1.71 GW average doesn't tell us the maximum instantaneous GW capacity of Google -- if we pull a 4x out of the hat (i.e. peak usage is 4x above average), it becomes ~7 GW of available compute.

            More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.

        • By Capricorn2481 2025-09-2217:49

          Does Google not include AI?

      • By jazzyjackson 2025-09-2217:483 reply

        I mean if 10 GW of GPUs gets us AGI and we cure cancer, then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated TikTok influencers

        • By yard2010 2025-09-2218:36

          Current LLMs are just like farms. Instead of tomatoes by the pound, you buy tokens by the pound. So it depends on the customers.

        • By junon 2025-09-2218:21

          This is also my take. I think a lot of people miss the trees for the forest (intentionally backward).

          AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.

        • By rebolek 2025-09-2221:22

          And when it’s built, Sam Altman will say: We are so close, if we get 10TW, AGI will be here next year!

      • By diego_sandoval 2025-09-2219:37

        Do you think the existence of NYC and Chicago is insanely wasteful?

HackerNews