Covering electricity price increases from our data centers

2026-02-11 21:12 · www.anthropic.com

Anthropic is an AI safety and research company that's working to build reliable, interpretable, and steerable AI systems.

As we continue to invest in American AI infrastructure, Anthropic will cover electricity price increases that consumers face from our data centers.

Training a single frontier AI model will soon require gigawatts of power, and the US AI sector will need at least 50 gigawatts of capacity over the next several years. The country needs to build new data centers quickly to maintain its competitiveness on AI and national security—but AI companies shouldn’t leave American ratepayers to pick up the tab.

Data centers can raise consumer electricity prices in two main ways. First, connecting data centers to the grid often requires costly new or upgraded infrastructure like transmission lines or substations. Second, new demand tightens the market, pushing up prices. We’re committing to address both. Specifically, we will:

  • Cover grid infrastructure costs. We will pay for 100% of the grid upgrades needed to interconnect our data centers, paid through increases to our monthly electricity charges. This includes the shares of these costs that would otherwise be passed onto consumers.
  • Procure new power and protect consumers from price increases. We will work to bring net-new power generation online to match our data centers’ electricity needs. Where new generation isn’t online, we’ll work with utilities and external experts to estimate and cover demand-driven price effects from our data centers.
  • Reduce strain on the grid. We’re investing in curtailment systems that cut our data centers’ power usage during periods of peak demand, as well as grid optimization tools, both of which help keep prices lower for ratepayers.
  • Invest in local communities. Our current data center projects will create hundreds of permanent jobs and thousands of construction jobs. We’re also committed to being a responsible neighbor—that means addressing environmental impacts, including deploying water-efficient cooling technologies, and partnering with local leaders on initiatives that share AI’s benefits broadly.

Where we work with partners to develop data centers for handling our own workloads, we make these commitments directly. Where we lease capacity from existing data centers, we’re exploring further ways to address our own workloads' effects on prices.

Of course, company-level action isn't enough. Keeping electricity affordable also requires systemic change. We support federal policies—including permitting reform and efforts to speed up transmission development and grid interconnection—that make it faster and cheaper to bring new energy online for everyone.

Done right, AI infrastructure can be a catalyst for the broader energy investment the country needs. These commitments are the beginning of our efforts to address data centers’ impact on energy costs. We have more to do, and we’ll continue to share updates as this work develops.



Comments

  • By Aurornis 2026-02-12 1:52 · 6 replies

    > Cover grid infrastructure costs. We will pay for 100% of the grid upgrades needed to interconnect our data centers, paid through increases to our monthly electricity charges. This includes the shares of these costs that would otherwise be passed onto consumers.

    This is great, but do they have an actual example of something that would have been passed on to consumers? Or is it just a hypothetical?

    In the location I’m familiar with, large infrastructure projects have to pay their own interconnection costs. Utilities are diverse across the country, so I wouldn’t be surprised if there are differences, but in general I doubt there are many situations where utilities were going to raise consumers’ monthly rates specifically to connect some large commercial infrastructure.

    Maybe someone more familiar with these locations can provide more details, but I think this public promise is rather easy to make.

    • By epistasis 2026-02-12 3:18 · 1 reply

      There's a huge diversity of pricing and regulatory schemes across the US. I think your skepticism is well placed in general: where I live in California, the price increase has come almost entirely from bad grid-maintenance policies of years past, but people come up with random other excuses.

      However there are some examples where increased demand by one sector leads to higher prices for everyone. The PJM electricity market has a capacity market, where generators get compensated for being able to promise the ability to deliver electricity on demand. When demand goes up, prices increase in the capacity market, and those prices get charged to everyone. In the last auction, prices were sky high, which leads to higher electricity prices for everyone:

      https://www.utilitydive.com/news/pjm-interconnection-capacit...

      A lot of electricity markets in other places allow procurement processes where increased costs to meet demand get passed to all consumers equally. If these places were actually using IRPs that had up to date pricing, adding new capacity from renewables and storage would lower prices, but instead many utilities go with what they know, gas generators, which are in short supply and coming in at very high prices.

      And the cost of the grid is high everywhere. As renewables and storage drive down electricity generation prices, the grid will come to be a larger and larger percentage of electricity costs. Interconnection is just one bit of the cost, transmission needs to be upgraded all around as overall demand grows. We've gone through a few decades of stagnant to lessening electricity demand, and utilities are hungry to do very expensive grid projects because they get a guaranteed rate of return on grid expansion in most parts of the country.

      • By bigbadfeline 2026-02-12 9:03 · 1 reply

        There's only one way to resolve this: datacenters should build their own energy generation, not connected to the grid, just local. Otherwise, they'll muddy the water to no end until they manage to saddle the rest of us with their energy costs.

        • By lotsofpulp 2026-02-12 13:44 · 1 reply

          The muddied water is just supply and demand. If a datacenter increases demand for electricians, natural gas, solar panels, copper, whatever else, then the price will have to go up. The only thing that can bring prices down in the face of increased demand is increased supply.

          The demand is still there, connected to the grid or not. The grid can help make things more efficient and resilient in some ways (and less resilient in other ways), which is why the grid came about in the first place.

          • By bigbadfeline 2026-02-12 20:12

            > The only thing that can bring prices down in the face of increased demand is increased supply.

            That's not the question. We aren't discussing trivialities like what change in supply is necessary to satisfy increased demand; that's like discussing "is water wet" or "do you need more or less water to satisfy your thirst for water".

            The real question is: who is going to pay for building the additional supply?

            Residential and other prior customers have already paid the capex for the existing supply and now you want them to pay the capex for enormous amounts of new capacity which the AI corps convert exclusively into their own revenue.

            The public is already paying through the nose for new semiconductor capacity because the same scam-geniuses cornered the RAM, GPU and related chips market and they are mercilessly scalping it too, again at the expense of the public.

            > The grid can help make things more efficient and resilient in some ways

            In a perfect world it can, in this world it makes things more unstable and far more unfair when large new consumers use it for their exclusive revenue extraction while pretending that the new capacity is somehow benefiting everybody instead of just them.

            > The muddied water is just supply and demand.

            Indeed, "just supply and demand" is the mud in the eyes.

    • By mysterydip 2026-02-12 1:58

      North Carolina passed Senate Bill 266, changing how utilities can recover costs for projects under construction amid rising energy demand, particularly from data centers. Now Duke Energy wants a double digit price rate increase: https://starw1.ncuc.gov/NCUC/ViewFile.aspx?Id=0ac12377-99be-...

    • By wmf 2026-02-12 2:56

      Putting aside interconnection costs, when electricity is auctioned increased demand can increase wholesale prices for everyone.

    • By ZeroGravitas 2026-02-12 14:13

      Amazon tried to buy an existing nuclear plant's output from a company called Talen for a datacenter colocated with the plant. They would do a special deal so the electricity they bought wouldn't go via the shared grid.

      It got blocked by FERC as it would raise other consumers' energy prices and the deal wasn't fully transparent (probably intentionally so they could shift costs onto others).

    • By hattmall 2026-02-12 4:27

      Georgia power already has a demand scaled recovery charge addition to bills that increases prices for residential customers regardless of where the demand originates. It used to be only applied occasionally during the summer. Now they've adjusted the peak / off-peak rates to be what it used to be plus the demand recovery, and now the demand recovery is additional and just applies pretty much all the time.

    • By pvab3 2026-02-12 2:06

      Generally most distribution costs are socialized starting with the REA and such. My block needed a new transformer a few weeks ago and it will be paid for by every customer of that utility.

  • By unltdpower 2026-02-12 1:50 · 7 replies

    "Committing to buying the glass to replace the window I broke in your shop to rob the place, you're welcome."

    > Training a single frontier AI model will soon require gigawatts of power, and the US AI sector will need at least 50 gigawatts of capacity over the next several years.

    These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.

    • By keeda 2026-02-12 3:42 · 4 replies

      > These things are so hideously inefficient.

      Quite the opposite, really. I did some napkin math for energy and water consumption, and compared to humans these things are very resource efficient.

      If LLMs improve productivity by even 5% (studies actually peg productivity gains across various professions at 15-30%, and these are from 2024!), the resource savings from accelerating all knowledge workers are significant.

      Simplistically, during 8 hours of work a human would consume 10 kWh of electricity + 27 gallons of water. Sped up by 5%, that drops by 0.5 kWh and 1.35 gallons. Even assuming the higher end of resources used by LLMs, 100 large prompts (~1 every 5 minutes) would only consume 0.25 kWh + 0.3 gallons. So we're still saving ~0.25 kWh + 1 gallon overall per day!

      That is, humans + LLMs are way more efficient than humans alone. As such, the more knowledge workers adopt LLMs, the more efficiently they can achieve the same work output!

      If we assume a conservative 10% productivity speedup, adoption across all ~100M knowledge workers in the US will recoup the resource cost of a full training run in a few business days, even after accounting for the inference costs!
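      The arithmetic above can be sanity-checked with a short script. Every figure below is this comment's napkin-math assumption (10 kWh and 27 gallons per 8-hour workday, a 5% speedup, 0.25 kWh and 0.3 gallons per 100 large prompts), not a measured value:

```python
# Sanity check of the napkin math above. All inputs are the
# comment's assumed figures, not measurements.
human_kwh_per_day = 10.0   # electricity attributed to 8 hours of knowledge work
human_gal_per_day = 27.0   # water attributed to the same period
speedup = 0.05             # assumed 5% productivity gain from LLM use

llm_kwh_per_day = 0.25     # ~100 large prompts, high-end estimate
llm_gal_per_day = 0.3

# Net daily savings: resources "recovered" by the speedup, minus LLM usage.
saved_kwh = human_kwh_per_day * speedup - llm_kwh_per_day
saved_gal = human_gal_per_day * speedup - llm_gal_per_day

print(f"net savings per worker-day: {saved_kwh:.2f} kWh, {saved_gal:.2f} gallons")
```

      The result (0.25 kWh and about 1 gallon per worker-day) matches the comment's totals, but the conclusion is only as good as the assumed inputs.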

      Additional reading with more useful numbers (independent of my napkin math):

      https://www.nature.com/articles/s41598-024-76682-6

      https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans...

      • By wlesieutre 2026-02-12 3:59 · 5 replies

        So with AI doing more of the work and fewer humans needed, what are you doing with the extra humans to eliminate their no-longer-productive resource consumption?

        Saying “we can do the same work with less resource use” doesn’t mean resource consumption is reduced. You’ve just gone from humans using resources to humans using the same resources and doing less work, plus AI using more resources.

        • By pmontra 2026-02-12 7:15

          Resource consumption often goes up. It's a time vs energy tradeoff and it's not free.

          Your question is a variant of: what do we do with all those humans now that they don't have to walk miles to the well every day, because we invented aqueducts? The point is that they didn't want to walk to the well; they had to (and in some places they still have to). Likewise, very few people want to work, even now and even us, but they have to.

          We will see what happens this time when we won't have to walk to that well.

        • By danans 2026-02-12 4:47

          > So with AI doing more of the work and fewer humans needed, what are you doing with the extra humans to eliminate their no-longer-productive resource consumption?

          Soon enough, we won't be able to avoid this question.

        • By whatshisface 2026-02-12 5:02

          You put them to work doing more things than were possible in a month before.

        • By keeda 2026-02-12 5:37 · 1 reply

          The thing is, there are many interplaying dynamics here that are impossible to unravel. This is why I called it "napkin math", because figuring out the full ramifications of this change is a pretty large economic problem that nobody has figured out!

          For instance, I think operating at this level of productivity is unsustainable (https://news.ycombinator.com/item?id=46938038). As discussed in detail by the recent "AI vampire" blog: https://news.ycombinator.com/item?id=46972179 -- most humans are not designed for that level of cognitive intensity.

          But even then, the productivity per human will explode, and we will still have the problem of "too many humans." Cynically, if most knowledge workers get laid off, it's good from an environmental perspective because that means much less commuting and pollution! But then they're starving and we will have riots!

          This is where I foresee the near-term problems with GenAI: social turmoil rather than resource consumption. I suspect it's not all bad news though. While it's impossible to put numbers on it, it helps to think about the first-order economic principles that are in play:

          1. This is hand-wavy, but knowledge work boosts economic growth. If this is massively accelerated, we should be creating surplus value that compensates for a lot of costs.

          2. However, a huge chunk of knowledge work is busy work which will be automated away. People can try upskilling, but the skill gap is already huge and growing quickly, and they will lose jobs.

          3. The economy is essentially people providing and paying for services and goods. If people lose jobs and cannot earn, they cannot drive the economy and it shrinks.

          4. The elite, counter-intuitively enough, do NOT want that because they get richer by taking a massive cut of the economy! (Not to mention life in a doomsday bunker can get pretty dull if starving people start rioting -- https://news.ycombinator.com/item?id=46896066)

          There are many more dynamics at play of course, but I think an equilibrium will be found purely because everyone is incentivized to find a solution (UBI?) that keeps both the elites and the plebes living long and prospering. I expect some turmoil, but luckily, the severe resource crunch of GPUs gives us time to figure things out.

          • By keybored 2026-02-12 10:40 · 1 reply

            What I gather from your analysis and gleeful exclamation marks is that I should start rioting now rather than wait.

        • By _kb 2026-02-12 8:28

          Turn them into biogas to create more energy for DCs.

      • By danielbln 2026-02-12 3:54 · 1 reply

        Do keep in mind that 1 large prompt every 5 minutes is not how e.g. coding agents are used. There it's 1 large prompt every couple of seconds.

        • By keeda 2026-02-12 5:52

          True, but I think in these scenarios they rely on prompt caching, which is much cheaper: https://ngrok.com/blog/prompt-caching/

          I have no expertise here, but a couple years ago I had a prototype using locally deployed Llama 2 that cached the context (now deprecated https://github.com/ollama/ollama/issues/10576) from previous inference calls and reused it for subsequent calls. The subsequent calls were much, much faster. I suspect prompt caching works similarly, especially given changed code is very small compared to the rest of the codebase.

      • By gosub100 2026-02-12 14:56 · 1 reply

        Are you excluding the cost of training the AI from the calculation?

        • By keeda 2026-02-12 20:17

          In the initial analysis of a single worker, yes, but when scaling up per-human savings to use by the wider population, the aggregate resource savings compensate for training resource usage within a few days, weeks at most.

      • By what 2026-02-12 6:05 · 3 replies

        How is a human consuming 27 gallons of water in an 8 hour work shift?

    • By Dylan16807 2026-02-12 3:28

      > "Committing to buying the glass to replace the window I broke in your shop to rob the place, you're welcome."

      Buying electricity isn't inherently destructive. That's a very bad analogy.

      > These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.

      I'm not arguing that they are efficient right now, but how would you measure that? What kind of output does it have to make per kWh of input to be acceptable? Keep in mind that the baseline of US power use is around 500 GW, and that AI is currently maybe 10 GW.

    • By epistasis 2026-02-12 3:26

      Adding new electricity demand to the grid should not be viewed as breaking windows and robbing others. When I bought an EV, I increased my electricity demand a huge amount, but it's not like I'm stealing from my neighbors. No rules were broken. We just need to make sure that I pay enough for my additional demand.

      > AI sector will need at least 50 gigawatts of capacity over the next several years.

      The error bars on this prediction are extremely large. It would represent a 5% increase in capacity in "the next several years" which is only a percent or two per year, but it could also only be 5GW over the next several years. 50GW represents about 1 year of actual grid additions.

      > All of you building these things for these people should be embarrassed and ashamed.

      I'm not building these things, and I think there should be AI critique, but this is far over the top. There's great value for all of humanity in these tools. The actual energy use of a typical user is not much more than a typical home appliance, because so many requests are batched together and processed in parallel.

      We should be ashamed of getting into our cars every day; that's a true harm to the environment. We should have built something better and allowed more transit. A daily commute of 30 miles is disastrous for the environment compared to any AI use that's really possible at the moment.

      Let's be cautious of AI but keep our critiques grounded in reality, so that we have enough powder left to fight the rest of things we need to change in society.

    • By esafak 2026-02-12 4:36 · 1 reply

      How are you measuring efficiency? They're better than most humans, which is what I would need more of as a substitute.

      • By digitalPhonix 2026-02-12 6:31 · 1 reply

        A human consumes about 100 watts when not doing any physical exertion (round number, rule of thumb). So unless you can show an LLM running on 100w compute with capabilities similar to a human, they’re less efficient.

        • By pmontra 2026-02-12 7:03 · 1 reply

          100 W is only the start. Say I consume 100 W all day long. I use an LLM for coding assistance in the old way: asking questions and copy-pasting code. It's much faster than me at writing that code, and I don't think it ever works a full hour for me per day. It's probably 10 cumulative minutes, probably much less. Round it up to 12 minutes (1/5 of an hour) or down to 6 minutes (1/10). So instead of 24 hours it's 0.1 hours, 240 times less. Those 100 W could be 24,000 W and the total energy per day would be the same. Is the LLM actually consuming 24 kW when working for me? No idea, but I hope it's less than that.

          Of course I could do all of my coding alone again, but I would be slower. It's like walking to the mall several times per week, several hours per time, instead of once or twice per week with a car, three cumulative hours. I trade a higher energy consumption for more time to do other things and the ability to live far away from shops.
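          The tradeoff described above works out like this (using the comment's round numbers; the 100 W human figure and the ~6 cumulative minutes of LLM time per day are the commenter's assumptions):

```python
# Energy-equivalence sketch for the comparison above,
# using the comment's round numbers.
human_watts = 100.0        # assumed continuous human power draw
hours_per_day = 24.0
human_wh_per_day = human_watts * hours_per_day   # 2400 Wh each day

llm_active_hours = 0.1     # ~6 cumulative minutes of LLM work per day
# Power the LLM could draw while active and still use no more
# energy per day than the human does:
break_even_watts = human_wh_per_day / llm_active_hours

print(f"human: {human_wh_per_day:.0f} Wh/day, "
      f"LLM break-even draw: {break_even_watts:.0f} W")
```

          That reproduces the comment's 24,000 W figure: 240 times less active time allows 240 times the power draw for the same daily energy.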

          • By digitalPhonix 2026-02-12 14:48 · 1 reply

            If you as a human coding for 24 hours a day is the benchmark for LLM efficiency, we have other problems.

            • By pmontra 2026-02-12 21:33 · 1 reply

              I believe that we consume 100 W on average no matter what we do, except intense physical activities, which consume more.

              • By digitalPhonix 2026-02-13 1:47

                Right, but you do stuff other than work 24 hours a day, right? You have fun, relax, etc.

                Counting the 100w for 24 hours for a human doesn’t match up with counting the power usage from “AI” for only the 10 minutes it’s doing a task.

                Also, a units issue: 100 watts for a day is 2400 watt-hours. It's a moot point anyway, because the power draw of the frontier models is an order of magnitude off, so the division by 24 is basically meaningless.

    • By dudisubekti 2026-02-12 7:08

      Are you against inefficiency or just LLMs? If it's the former, I assure you LLMs are nowhere near the top of the list lol

      You should start with the beef industry.

    • By Andys 2026-02-12 4:21

      Every piece of progress looks like this to begin with.

    • By measurablefunc 2026-02-12 2:38

      The numbers must go up, there is no other way.

  • By venk12 2026-02-12 9:35 · 4 replies

    We've moved past asking where the energy comes from or how our planet will survive this critical phase.

    These days, it's about framing - every country is scrambling to up their game just to stay in power. The companies that are riding this wave are spending millions in marketing, lobbying and billions on consuming energy so that they can make trillions in valuation.

    I am also an ardent user of AI - but sometimes I do feel guilty when I use so many tokens - because I know I am burning energy, and feeding part of this mission. If there is a solution, I would like to be a part of it.

    • By deaux 2026-02-12 10:42 · 3 replies

      > I am also an ardent user of AI - but sometimes I do feel guilty when I use so many tokens - because I know I am burning energy, and feeding part of this mission. If there is a solution, I would like to be a part of it.

      This is by far the best article I've seen on it [0]. Which leads me to conclude: if you use coding agents, then yes, it's definitely a concern. Yet if you drive daily, even an EV, it's very small compared to that. Let alone flying. Personally, even if my "AI emissions" are at 10x his estimated usage (they almost certainly aren't), the other sacrifices I make to reduce emissions have such an impact that I'd still be multiple times below the national average.

      Note how the above measures energy usage (kWh), not emissions. For anyone taking fossil-fuel transport regularly, whether ICE car, taxi, or airplane, AI usage is all but guaranteed to be meaningless compared to their transport emissions. One hamburger is at least 5x the emissions of his "median day with Claude Code", so there's another comparison. If you're feeling guilty, track how much beef you're eating, cut it down by 20%, and use agents to your heart's content.

      Now of course, a different form of AI usage like image generation and especially video generation is incomparably more energy-intensive per query. We'd need separate math on that.

      [0] https://www.simonpcouch.com/blog/2026-01-20-cc-impact/

      • By Deanallen 2026-02-12 23:07 · 1 reply

        It’s not clear from the article, but do these statistics take into account the amount of electricity and water required to train the model, in addition to inference?

        For example, the article says the author's daily average use of Claude Code is similar to running the dishwasher. Is that just inference, or does it also include training Opus 4.5?

        • By deaux 2026-02-13 3:32

          This is a great question. To my understanding the industry consensus is that for the big three providers, energy spend on inference had already surpassed training by summer last year, and the former's share only keeps increasing. The problem is that there's no hard data in public.

          What we need to do here is write an article that makes a wild claim in either direction ("99% is inference!"), post it on HN, and wait for the comments to roll in that prove it right or wrong.

      • By moralestapia 2026-02-12 14:56

        Indeed, it's all bullshit.

        But it's the bullshit some people like so it's not going to go away soon.

      • By gosub100 2026-02-12 14:08 · 1 reply

        > the other sacrifices I make

        This frames the dilemma as: you just need to make this little sacrifice so Trillion.ai can make its trillion. We shouldn't sacrifice anything.

        • By deaux 2026-02-12 14:27 · 2 replies

          That applies the exact same way when talking about buying a new Revuelto from Lamborghini, or even a hamburger from McD's. There's nothing special about Trillion.ai's profit, nor its emissions, that make it any different. If I want to do either of those, and I don't want to feel guilty about it, then I need to make sacrifices elsewhere. A lot more sacrifices than if I spent a day using Trillion.ai to write some code, in truth.

          • By chasing0entropy 2026-02-12 14:42 · 1 reply

            The key word is "feel", which has a direct and causal relationship to societal programming, which is directly impacted if not dictated by media and marketing. Both are heavily influenced by big players who encourage the consumer to feel guilty while paying both for the resource they are using AND its markup, which is ostensibly used for marketing, so the consumption-guilt "feeling" feedback loop grows.

            • By deaux 2026-02-12 14:51

              > both of which are heavily influenced by big players who encourage the consumer to feel guilty while paying both for the resource they are using

              Hah, if only. Man, I wish that companies succeeded in doing that, then we'd have a lot more people making such sacrifices. That'd be great.

              No one wants their customer to feel guilty because it makes them less likely to buy the product. It's the worst nightmare of any marketer.

          • By gosub100 2026-02-12 14:51 · 1 reply

            I agree with your statement, but the difference is that we all will pay higher electricity costs whether we use it or not. That's the difference between McD's and AI.

            Yet another example of socialize the costs, privatize the profits (except AI isn't profitable yet, lol)

            • By deaux 2026-02-12 14:57

              But that too goes for the others all the same. We all pay with our health because some people fly with private jets, or drive Lambos, or indeed eat hamburgers every day. Our health being quite the more precious resource than our money.

              But even on the subject of electricity costs. It looks like the biggest electricity consuming sector globally is.. the oil industry! So we're back to the Lambo drivers.

    • By xnx 2026-02-12 10:28

      > sometimes I do feel guilty when I use so many tokens

      There's nothing particularly worse about money spent on AI vs. anything else. I don't feel guilty for having 6 shirts even though I can only wear one at a time.

    • By pjc50 2026-02-12 10:49 · 1 reply

      > how our planet will survive this critical phase.

      > trillions in valuation.

      This is more or less literally the "yes we destroyed the planet, but for a brief moment we created trillions in shareholder value" meme. Perhaps we need to take a step back and ask to what extent this benefits humans as humans, not as economic units. Especially given the explicit threat in the AI marketing material to destroy all creative industries and replace human fulfilment and even connection with AI.

      • By javcasas 2026-02-12 14:33

        We were told the "paperclip maximizer" as a cautionary tale. Instead, we constructed the "valuation maximizer".

    • By rkuykendall-com 2026-02-12 13:20 · 1 reply

      There are people who recognize there is a problem and would support collective action to fix it, and there are those that don't. As long as you are in the first group, there's nothing else you can individually do to make a difference.

      • By gosub100 2026-02-12 13:53 · 1 reply

        What if I was in the first group, but abandoned them after being lied to? Told how important it was for ME to take public transit and drive an EV, only for AI to march in and consume multiple nations' worth of energy. That's not "collective", that's "rules for thee but not for me".

        • By rkuykendall-com 2026-02-12 16:40

          Well yeah, I agree, collective action would have to be actually collective. I'm not going to prescribe specifics, but something like taxes or infrastructure changes: stuff like that.

          You're just describing individual action, which like you said, isn't gonna do anything.

HackerNews