
Samsung’s 60% DRAM price hike tightens supply. Learn how rising DDR5 and DDR4 prices impact buyers, sellers, and used RAM markets in 2026.
Samsung hikes memory chip prices by up to 60% (image credits: samsung.com)
Samsung’s memory prices have surged by as much as 60% since September, according to a Reuters report published Friday [1]. The move marks one of the steepest short-term price increases the DRAM market has seen in years, and it underscores how surging demand from AI data centers is straining global supply chains.
The world’s largest memory manufacturer, Samsung, is now commanding significantly higher contract prices from its customers for high-density server DRAM modules. For buyers across the PC, server, and IT hardware industries, the implications could be felt well into 2026.
According to Reuters, Samsung’s contract pricing for a 32 GB DDR5 memory module climbed from roughly $149 in September to about $239 in November 2025 — a jump of more than 60 percent within just two months. Prices for other capacities, such as 16 GB and 128 GB modules, reportedly rose between 40 and 50 percent, while 64 GB and 96 GB units saw increases exceeding 30 percent [1].
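For readers who want to sanity-check the arithmetic, here is a quick sketch in Python using only the Reuters figures quoted above (absolute September prices for the other capacities were not published, so only the 32 GB change is computed):

    # Percent change in Samsung's 32 GB DDR5 contract price
    # (September vs. November 2025 figures cited by Reuters [1])
    sep_price = 149.0  # USD
    nov_price = 239.0  # USD
    pct = (nov_price - sep_price) / sep_price * 100
    print(f"{pct:.1f}%")  # 60.4% -- "more than 60 percent"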
A related analysis by Tom’s Hardware [2] noted that these hikes began in late summer, but accelerated rapidly as data center orders for AI workloads soaked up available capacity. By mid-November, contract pricing for several server-grade DRAM products had reached their highest levels since before the pandemic — effectively resetting the market baseline.
Samsung declined to comment publicly on the specific pricing changes. However, analysts cited by Reuters said the move reflects a combination of supply constraints, production prioritization for high-bandwidth memory (HBM), and a global inventory shortage among large buyers such as cloud providers and enterprise OEMs.
The primary catalyst behind the surge is the explosive growth of AI infrastructure.
Every new data center built for large language model (LLM) training or inference consumes vast quantities of DDR5 and HBM memory. With Nvidia’s latest AI accelerators and high-performance servers demanding larger and faster memory pools, suppliers are struggling to keep pace.
Samsung and its peers, SK hynix and Micron Technology, have redirected much of their fabrication capacity to high-end chips used in AI servers. While this shift yields higher margins, it leaves less capacity for traditional DRAM products that power laptops, desktops, and mainstream servers.
Industry observers describe the situation as a “perfect storm”:
- AI buildouts are pulling supply upward,
- consumer demand is recovering from its 2023 slump, and
- manufacturing utilization remains constrained by earlier capital-expenditure cuts.
As Tom’s Hardware summarized, “AI data center build-outs are strangling DRAM supply, forcing contract and spot prices upward across every segment of the market.” [2]
Though the price spikes originated in the contract market, where major buyers negotiate directly with Samsung, Micron, and SK hynix, the effects are now filtering down into retail channels.
Independent market data from PCGamer and DRAMeXchange show that average retail prices for DDR5 kits have doubled year-over-year in late 2025, while even older DDR4 modules have risen 20–30 percent as manufacturers scale back production.
At the system-builder and reseller level, this means margins are tightening — and volatility is rising. Small and mid-sized refurbishers, OEMs, and IT resellers face growing challenges in predicting costs and sourcing reliable inventory.
For example, popular 32 GB DDR5-5600 desktop kits that sold for under $100 in early 2024 are now retailing closer to $180–$200, with some premium kits exceeding $250.
Most analysts now expect the current price up-cycle to persist well into 2026, unless global DRAM output increases significantly. Industry tracker TrendForce predicts that total DRAM supply growth will remain in the single digits through next year [3], even as demand continues to accelerate from both AI and conventional server markets.
For new modules, current contract trends point toward a further 20–40 percent increase in early 2026 if capacity remains constrained. High-performance DDR5 modules — especially 64 GB and 128 GB configurations — are likely to remain scarce and expensive, while legacy DDR4 products could see incremental hikes as production lines are converted or retired.
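To make that projection concrete, here is a minimal sketch applying the 20–40 percent range to the $239 November contract price quoted earlier (an illustration of the cited trend, not an independent forecast):

    # Hypothetical early-2026 range for the 32 GB DDR5 contract price,
    # applying the projected 20-40% further increase to the $239 baseline
    base = 239.0  # USD, November 2025 contract price
    low, high = base * 1.20, base * 1.40
    print(f"${low:.0f} to ${high:.0f}")  # $287 to $335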
Manufacturers are expected to maintain these elevated prices to recoup heavy losses incurred during the 2022–2023 memory downturn. With inventory pipelines thin and fabrication utilization near full, a meaningful price correction appears unlikely before late 2026.
The secondary market for memory — including used, pulled, or refurbished modules — is already responding. Historically, used DRAM prices lag new DRAM prices by one or two quarters, but the current surge is compressing that gap.
As new-module prices soar, many resellers are holding existing stock longer, anticipating further gains. Prices for used DDR5 modules have risen 10–25 percent since September, and DDR4 resale values are also climbing. In certain capacities, the gap between new and used modules has narrowed to less than 15 percent, particularly in enterprise-grade 32 GB and 64 GB kits.
That dynamic may persist — or even intensify — if Samsung’s pricing strategy encourages other suppliers to follow suit.
For system builders, data-center operators, and component resellers, this price environment requires careful planning. The combination of long lead times, thin inventory buffers, and steady demand growth means timing purchases strategically could be critical.
Businesses that rely on large-scale memory procurement may benefit from locking in supply contracts early, even at higher prices, to avoid potential shortages later in the cycle. Meanwhile, sellers in the secondary market have an opportunity to capture higher resale margins while demand for used memory remains strong.
For consumers and small IT operators, the key message is caution: expect sustained volatility in both retail and used-module pricing through at least mid-2026.
Samsung’s aggressive price moves have reshaped the dynamics of the memory industry almost overnight. The AI boom has not only transformed how chips are used but also how they are valued — elevating DRAM from a commodity component to a strategic bottleneck in the computing supply chain.
While manufacturers stand to profit from the rebound, buyers across every tier — from hyperscale data centers to local PC resellers — are being forced to adapt. Memory, once abundant and inexpensive, is entering a period of sustained scarcity and strategic importance.
At BuySellRAM.com, we do more than track the market — we empower our clients to act on it. Whether you’re looking to sell used RAM at peak value or lock in pricing on new DDR5 inventory, our real-time insights and supplier relationships give you the advantage.
DRAM prices may be volatile, but opportunities are everywhere for those who move early. Visit our Sell Server RAM page to get an instant quote or explore our live market updates for new and used memory products.
Sources:
[1] Reuters, report on Samsung's DRAM contract price increases (published Friday, November 2025).
[2] Tom's Hardware, analysis of AI data-center demand and DRAM pricing (late 2025).
[3] TrendForce, DRAM supply growth forecast through 2026.
I'm so mad about this. I need DDR5 for a new mini-PC I bought, and prices have literally gone up by 2.5x.
128GB used to be $400 in June, and now it's over $1,000 for the same 2x64GB set.
I have no idea if/when prices will come back down but it sucks.
DRAM alternates between feast and famine; it's the nature of a business where the granularity of investment is so huge (you have a fab or you don't, and they cost billions, maybe trillions by now). So it will swing back. Unfortunately it looks like maybe 3-5 years on average, from some analysis here: https://storagesearch.com/memory-boom-bust-cycles.html
(That's just me eyeballing it, feel free to do the math)
Nothing costs trillions.
Ordered some servers 6 months ago: ~$12k per unit.
Same order, same bill of materials: $17.5k per unit today.
That is roughly a $5.5k increase for 768GB of DDR5 ECC memory and the four 2TB NVMe SSDs.
I just looked at the invoice for my current PC parts that I bought in April 2016: I paid 177 EUR (~203 USD) for 32GB (DDR4-2800).
It's kinda sad when you grow up in a period of rapid hardware development and now see 10 years going by with RAM $/GB prices staying roughly the same.
Olds remember the years around '95 when RAM stayed the exact same price per megabyte for what seemed a decade.
Aside, $203 USD back then would be about $276 USD after inflation. Not a primary effect, but contributory.
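A quick check of that inflation figure, using only the commenter's own numbers:

    # Implied cumulative US inflation, April 2016 to late 2025
    print(276 / 203)  # ~1.36, i.e. roughly 36% over the period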
Also we’re likely comparing RAMs at different speeds and memory bandwidth.
I just gave up and built an AM4 system with a 3090 because I had 128GB of DDR4 UDIMMs on hand; the whole build cost less than the memory alone would have for an AM5/DDR5 build.
Really wish I could replace my old Skylake-X system, but even DDR4 RDIMMs for an older Xeon are crazy now, let alone DDR5. Unfortunately I need slots for 3x Titan V's for the 7.450 TFLOPS each of FP64. Even the 5090 only does 1.637 TFLOPS for FP64, so just hoping that old system keeps running.
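The arithmetic behind keeping that system, from the FP64 figures in the comment above:

    # FP64 throughput: three Titan V cards vs. one RTX 5090
    titan_v = 7.450   # TFLOPS FP64 per card
    rtx_5090 = 1.637  # TFLOPS FP64
    print(3 * titan_v)          # 22.35 TFLOPS aggregate
    print(titan_v / rtx_5090)   # ~4.55x more FP64 per card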
Wow, no kidding. I checked my BOM for the 9950 build I did a year ago, RAM price has doubled for the exact same DDR5-6000 sticks.
Doubled in the last 4 months https://www.youtube.com/watch?v=o5Zc-FsUDCM
Upgraded by adding 64GB. Last Friday I sold the 32GB I took out for what I paid for the 64GB in July... insane
Time to start scouring used-PC sales to reclaim the RAM and sell it for a profit?
Damn I bought a whole computer with 128GB RAM & 16-core Ryzen CPU for £325 a few months ago.
> I have no idea if/when prices will come back down but it sucks.
Years, or when the AI bubble pops, whatever comes first.
Similar situation with QLC flash and HDDs btw.
it's cyclical. just wait 10 years
I'm still on DDR3 :)
Why do we all need 128GB now? I was happy with 32.
Close a few Chrome tabs, and save some DDR5 for the rest of us. :-)
Why did you waste all your money on 32GB when 4GB is enough? Why did we all need 32GB?
I like to tell people I have 128GB. It's pretty rare to meet someone like me that isn't swapping all the time.
All I can say is,
- the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns. Even if it squeezes out every single other sector that happens to want to use SDRAM to do things OTHER than buffer memory before it's fed into a PCIe lane for a GPU.
- I'm really REALLY glad I decided to buy brand new gaming laptops for my wife and me just a couple months ago, after not having upgraded our gaming laptops for 7 and 9 years respectively. It seems like gamers are going to have it the worst - GPUs have been f'd for a long time due to crypto and AI, and now even DRAM isn't safe. Plus SSD prices are going up too. And unlike many other DRAM users, where it's a business thing and they can to some degree just hike prices to cover, gamers are obviously not running businesses. It's just making the hobby more expensive.
It is a weird form of centralized planning. Except there's no election to get on to the central committee, it's like in the Soviet era where you had to run in the right circles and have sway in them.
There's too much group-think in the executive class. Too much forced adoption of AI, too much bandwagon hopping.
The return-to-office fad is similar, a bunch of executives following the mandates of their board, all because there's a few CEOs who were REALLY worked up about it and there was a decision that workers had it too easy. Watching the executive class sacrifice profits for power is pretty fascinating.
Edit: A good way to decentralize the power and have better decision making would be to have less centralized rewards in the capital markets. Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed. Most market economics assumes that there's somewhat equal decision-making power among the economic actors. We are quickly trending away from that.
The funniest thing is that somehow the executive class is even more out of touch than they used to be.
At least before there was a certain common baseline derived from everyone watching the same news and reading the same press. Now they are just as enclosed in their thought bubbles as everyone else. It is entirely possible for a tech CEO to have a full company of tech workers despising the current plan while that person is constantly reinforced by LinkedIn and ChatGPT.
The out of touch leader is a trope that I'm willing to bet has existed as long as we've had leaders.
I remember first hearing the phrase "yes man" in relation to a human ass kisser my dad worked with in like 1988.
It's very easy to unknowingly surround yourself with sycophants and hangers-on when you literally have more money than some countries. This is true now and has been true forever. I'm not sure they're more out of touch so much as we're way more aware.
It's more than the fact they are surrounded by sycophants. It's also that, despite the mythology the executive-worship-industry tries to paint, CxOs and board members of companies are just not very creative or visionary people. They largely spend their time looking at their peers and competitors for hints about what they should be doing. And today, those hints all are "do AI". They're not sitting down and deriving from first principles that AI is the way--they're seeing their buddies steering other companies and they're all saying AI is the way, so they say AI is the way, too.
Sounds quite a bit like stock market. The more sober and cynical of them see fads as fads, irrational but powerful movements, and ride the waves, selling to a greater fool.
Out-of-touch leaders have existed for millennia. The "Emperor's New Clothes" tale was published in 1837 as a retelling of a much older folk tale. Sima Qian criticizes out-of-touch lords and emperors in his book about ancient history, written in the 1st century BC. Maybe there is even older evidence.
No surprise, the CxO class barely lives in the same physical world as us peasants. They all hang out together in their rich-people restaurants and rich-people galas and rich-people country clubs and rich-people vacation spots, socializing with other rich-people and don't really have a lot of contact with normal people, outside of a handful of executive assistants and household servants.
We need better antitrust and anti-monopoly enforcement. Break up the biggest companies, and then they'll have to actually participate in markets.
This was Lina Khan's big thing, and I'd argue that our current administration is largely a result of Silicon Valley no longer being able to get exits in the form of mergers and IPOs.
Perhaps a better approach to anti-monopoly and anti-trust is possible, but I'm not sure anybody knows what that is. Khan was very well regarded and I don't know anybody who's better at it.
Another approach would be a wealth and income taxation strategy to ensure sigmoid income for the population. You can always make more, but with diminishing returns to self, and greater returns to the rest of society.
Sorry, how did she stand in the way of IPOs? She was against the larger players providing easy off-ramps to smaller players but I don’t recall anything about IPOs. Indeed, Figma’s IPO is precisely because she undid the pending Adobe / Figma merger if I recall correctly.
A better approach might be to farm out shares to stakeholders. That seems a lot more dynamic and self-correcting than periodic taxation battles after the fact.
Khan was largely ineffectual. The current administration, if it can be blamed on SV at all, is more likely to be the result of Harris's insanely ill-timed proposal to tax unrealized capital gains just as election season was kicking into high gear.
IMO Khan was by far the best we've had in at least 2 decades. Her FTC even got a judge to rule to break up Google! The biggest downside Khan had was being attached to a 1-term president. There just aren't that many court cases against trillion-dollar companies that you can take from investigation to winning the appeal in 4 years.
All true, and I'm not making a value statement about whether her influence was good or bad. However, Khan only threatened the oligarchs' companies, while Harris point-blank threatened their fortunes.
Don't pick a fight with people who buy ink by the barrel and bandwidth by the exabyte-second. Or at least, don't do it a month before an election.
The oligarchs hated Khan with the intensity of a thousand burning suns. If you listened to All In, all they were doing was ranting about her and Gary Gensler.
That being said, Kamala's refusal to run on Khan's record definitely helped cost her the election. She thought she could play footsie with Wall Street and SV by backchanneling that she would fire Khan, so she felt like she couldn't say anything good about Khan without upsetting the oligarchs, but what Khan was doing was really popular.
Samsung lost a large percentage of market share to their competitors in the last couple years, so I'm pretty sure they already have to participate in markets.
Well, assuming they haven't revived the cartel.
Yea when I think of DRAM I think of SK Hynix and Micron with Samsung far behind.
I think a better solution is an exponential tax on company size. I.e., once a company starts to earn above, say, 1 billion, it will be taxed on income by an ever increasing amount. Or put another way, use taxes to break the power law and winner-takes-all effect into a Gaussian distribution of company sizes.
> I think a better solution is an exponential tax on company size. I.e., once a company starts to earn above, say, 1 billion, it will be taxed on income by an ever increasing amount.
This is in the right spirit but you want two things to be different about it.
The first is that the threshold for a given industry doesn't make sense as a dollar amount, it makes sense as a market share percentage. Having more than 15% market share should be a thing companies don't want, regardless of whether it's a $100 trillion industry or a $100 million one.
And the second is that taxes create a perverse incentive for the government. You absolutely do not want the government to have even more of a financial incentive to sustain and create more of the companies of that size. What you want is to have fewer of them.
So, what you want is a rule that if a company has more than 15% market share, the entire general public is allowed to sue them into bankruptcy for the offense of market consolidation. Which also removes the problem where they buy off the government prosecutors, because if they commit the offense then anybody can sue them.
> anybody can sue them.
who bears the costs of this suit?
And who determines what makes for a good market share size to be the threshold?
And by having such a rule, an industry that would be more efficient when consolidated would not be able to consolidate (and you wouldn't know it). It's a bad set of policies, imho.
A better way would be for the gov't to increase competition by adding supply or demand, whichever one is the bottleneck to competition. If a company such as AWS is gaining a lot of market share but its profit margins are still high, then the gov't should incentivize competition by funding or giving loans to businesses that want to compete with AWS.
However, if AWS's profit margins, even at high market share, remain very low (e.g., Amazon's commerce side), then there's no need for the gov't to "step in" at all, as there would be no incentive for any competitor to enter the market due to low margins.
This would permanently increase DRAM prices. Memory fabricators either earn billions of dollars in income each year or they can't keep going. There are no little Mom and Pop businesses that can do photolithography on leading process nodes.
Nonsense, it would force vertical de-integration.
Chip fabs used to be like book publishers; you don't have to own a printing press to be an author. Carver Mead even described his vision of the industry that way.
Nowadays you have to get your cell libraries and a large chunk of your toolchain from the fab. Of course it's laundered through Cadence+Synopsys, but it's still coming from the fab. You have to buy your masks from the fab (heck, they aren't even allowed to leave the fab, so do you really own them?). And on and on.
For the record I don't agree with the "exponential" part, but otherwise this is an underappreciated and powerful technique.
In another comment you proposed a sane version of the parent proposal. I wouldn't have commented if fpoling had originally floated that scheme. I was mainly objecting to drastically increasing taxes "once a company starts to earn above, say, 1 billion" without regard for the minimum viable scale of different businesses.
Is that revenue, or profit? If revenue, it'll slam certain kinds of high-volume low-profit businesses, and if it's profit then the company will just arrange to have big compensation "expenses" for executives.
The latter would have to be backstopped by taxes on individual income.
The sane version of this proposal omits the "exponential" part, applies to profits (net income), and makes the tax rate industry-specific (just like Washington State's revenue tax).
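A minimal sketch of what that "sane version" could look like as a bracketed tax on net income (the thresholds and rates below are invented purely for illustration, and a real scheme would vary the rate by industry as suggested):

    # Hypothetical bracketed tax on corporate net income (illustration only;
    # thresholds and rates are made up, not a policy proposal).
    BRACKETS = [
        (0,              0.21),  # (bracket floor in USD, marginal rate)
        (1_000_000_000,  0.30),
        (10_000_000_000, 0.40),
    ]

    def tax(net_income):
        owed = 0.0
        bounds = BRACKETS + [(float("inf"), 0.0)]
        for (lo, rate), (hi, _) in zip(BRACKETS, bounds[1:]):
            if net_income > lo:
                owed += (min(net_income, hi) - lo) * rate
        return owed

    print(tax(5_000_000_000))  # 0.21*1e9 + 0.30*4e9 = 1.41e9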
Then set limits so the top can't earn more than x times the lowest paid in the company.
Ah yes, the same tax mentality that is working great for EU innovation.
Corporate taxes specifically were quite high by European standards until 2027 and are not relatively that low today either
> There's too much group-think in the executive class.
I think this is actually the long tail of "too big to fail." It's not that they're all thinking the same way, it's that they're all no longer hedging their bets.
> we have made the rewards too extreme and too narrowly distributed
We give the military far too much money in the USA.
> We give the military far too much money in the USA.
~ themafia, 2025
(sorry)
On a more serious note the military is sure a money burning machine, but IMHO it's only government spending, when most of the money in the US is deliberately private.
The fintech sector could be a bigger example of a money-vacuuming system benefiting statistically nobody?
It's around 3.4% of GDP. That puts us in the top 10% or so worldwide, but it's not ridiculously high. It's on a similar level as countries such as Morocco and Colombia, which aren't known for excessive military spending. It's still kind of high for a country with no nearby enemies, but for the most part, US military spending is large because the US economy is large.
Military spending is a type of welfare for the wealthy; it is one of the only forms of public or government spending that doesn't crowd out private investors the way public housing or publicly funded hospitals do. The high military spending and the contractor class often vote more conservative than is typical for their demographic and economic peers. It's been high since WW2, with maybe a slight drop in the late 70s. The current stat of "3.4% of GDP" ignores the fact that a large part of our national debt is from the military and war budgets. I saw a statistic in the mid 1990s that if we had kept our military budget at inflation-adjusted levels equal to 1976, our debt would have gone to zero as early as 1994.
It's around 16% of the total federal budget. To be fair about 1/3 of "military spending" is actually Salaries, Medical, Housing and GI/Retirement costs.
It's also the case that none of the CIA, NSA or DHS budgets show up under the military, even though they're performing some of the same functions that would be handled by militaries in other countries.
We also have "black appropriations." So the total of the spending on surveillance and kinetic operations is often unknowable. Add to this the fact the Pentagon has never successfully performed an audit and I think people are right to be suspicious of the topline "fraction of GDP" number.
Just want to point out that the NSA is part of the DoD. (Or DoW now)
This is true; however, their agency budget is not part of the DoD's budget and is not included in the reported "total" for DoD.
At least not in the data set I use:
I think the number is probably much higher than we think - there is probably a ton of not so obvious spending on research and development.
Centralized planning is needed in any civilization. You need some mechanism to decide where to put resources, whether it's to organize the school's annual excursion or to construct the national highway system.
But yeah, in the end companies behave in trends: if some companies do it, then the other companies have to do it too, even if this makes things less efficient or is even hurtful. We can put that down to the human factor, but I think even if we replaced all CEOs with AIs, those AIs would all see the same information and make similar decisions based on it.
There are Pascal's-wager arguments to be had: for each individual company, the punishment for not playing the AI game and missing out on something big is bigger than the punishment for wasting resources by allocating them towards AI efforts plus annoying customers with AI features they don't want or need.
> Right now we are living through a new gilded age with a few barons running things, because we have made the rewards too extreme and too narrowly distributed.
The USA has rid itself of its barons multiple times. There are mechanisms in place, but I am not sure that people are really going to exercise those means any time soon. If this AI stuff is successful in the real world as well, then increasing amounts of power will shift away from the people to the people controlling the AI, with all the consequences that has.
If you get paid for being rich in proportion to how rich you are -- because that's how assets work -- it turns into an exponential, runs away, and concentrates power until something breaks.
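That point is just compound growth; a one-line model (the 7% return is an arbitrary assumption for illustration):

    # Wealth compounding at a fixed return r: W(t) = W0 * (1 + r)**t
    W0, r = 1_000_000, 0.07  # r = 7% is an arbitrary assumption
    for t in (10, 30, 50):
        print(t, round(W0 * (1 + r) ** t))  # ~2x, ~7.6x, ~29x the start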
>It is a weird form of centralized planning [...]
It's a form of "centralized planning", except it's not centralized at all.
And there’s no planning
How is this centralized planning? It's corporate decision-making operating in a free market to optimize for what majority shareholders want (though the majority of shares are owned by few).
A free market where the government participates with billions in investment and tax cuts, yes.
Your parenthetical is how. It's not completely centralized, but it is being decided by a very small number of people.
I think the implied thought (?) is there is a similarity between central planning and oligopoly bandwagoning. To my eye, the causes and dynamics are different enough to warrant bucketing them separately.
Every corporation is a (not so) little pocket of centrally planned economy.
The only saving grace is that it can die and others will scoop up released resources.
When country level planned economy dies, people die and resources get destroyed.
> Every corporation is a (not so) little pocket of centrally planned economy.
This is confused. Here is how classical economists would frame it: a firm chooses how much to produce based on its cost structure and market prices, expanding production until marginal cost equals marginal revenue. This is price guided production optimization, not central planning.
The dominant criticism of central planning is trying to set production quantities without prices. Firms (generally) don’t do this.
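A worked toy version of that condition for a price-taking firm (the quadratic cost curve and the numbers are chosen only for the example):

    # Price-taking firm: maximize profit p*q - C(q), with C(q) = a*q + b*q**2.
    # First-order condition: marginal cost C'(q) = a + 2*b*q equals price p,
    # so the firm produces q* = (p - a) / (2 * b). Numbers are arbitrary.
    p, a, b = 10.0, 2.0, 0.5
    q_star = (p - a) / (2 * b)
    print(q_star)  # 8.0: expand output until marginal cost = marginal revenue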
This is why I think taxes on the very wealthy should be so high that billionaires can't happen. The usual reasons are either about raising revenue or are vague ideas about inequality. It doesn't raise enough revenue to matter, and inequality is a fairly weak justification by itself.
But the power concentration is a strong reason. That level of wealth is incompatible with democracy. Money is power, and when someone accumulates enough of it to be able to personally shake entire industries, it's too much.
I disagree.
We have been living on the investment of previous centuries and decades in the West for close to 40 years now. Everything is broken but that didn't matter because everything that needed a functioning physical economy had moved to the East.
AI is the first industrial breakthrough in a century that needs the sort of infrastructure that previous industrial revolutions needed: namely a ton of raw power.
The bubble is laying bare just how terrible infrastructure is and how we've ignored trillions in maintenance to give a few thousand people tax breaks they don't really need.
Why not follow the time-honoured approach and put the data centres in low-income countries?
Because you only do that once the tech has been commoditized and you have wrung all the benefit for your country that you can.
The British didn't industrialise India for a reason.
I assume they don't have good enough power infrastructure.
>AI is the first industrial breakthrough in a century
Is it?
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
This resonates deeply, especially to someone born in the USSR.
This is part of how free markets self-correct: misallocate resources and you run out of resources.
You can blame irrational exuberance, bubbles, or whatnot; markets are ultimately individual choices times economic power. AI, crypto, housing, dotcom, etc. going back through history all had excess, because it's not obvious when to join and when to stop.
Usually companies run out of resources before they screw up global prices in massive markets.
If it was a couple billion dollars of memory purchasing nobody would care.
> Usually companies run out of resources before they screw up global prices in massive markets.
It happens more often than you might expect.
The Onion Futures Act and what led to it is always a fun read: https://en.wikipedia.org/wiki/Onion_Futures_Act
Just like some of the crypto booms and busts, if you time it right this could be a good thing. Buy on a refresh cycle when AWS dumps a bunch of used or refurbished chips and RAM (some places even offer a warranty, which is nice).
And if the market crashes or takes a big dip then temporarily eBay will flood with high end stuff at good prices.
Sucks for anyone who needs to upgrade in the next year or two though !
They're treating it as a "winner takes it all"-kind of business. And I'm not sure this is a reasonable bet.
The only way the massive planned investments make sense is if you think the winner can grab a very large piece of a huge pie. I've no idea how large the pie will be in the near future, but I'm even more skeptical that there will be a single winner.
What's odd about this is I believe there does exist a winner takes all technology. And that it's AR.
The more I dream about the possibilities of AR, the more I believe people are going to find it incredibly useful. It's just the hardware isn't nearly ready. Maybe I'm wrong but I believe these companies are making some of the largest strategic blunders possible at this point in time.
Why would AR be particularly likely to have a single winner?
It's not exactly a new type of failure. It's roughly equivalent to Ricardian rent, or pecuniary externalities for the general term. Though I suppose this is a speculative variant, which could be worse somehow.
why do you think allocating hardware to gamers is proper usage?
maybe AI cures cancer, or at least writes some code
For example: allocating the resources to only a few industries deprives everyone else (small players, hobbyists, gamers, tinkerers) of opportunities to play with their toys. And small players playing with random toys are a source of multiple innovations.
Unless I get all the resources I want, when I want, all at low prices, the market has obviously failed.
Gamers at least enjoy their GPUs and memory.
The tone from the AI industry sounds more like a dependent addict by comparison. They're well past the phase where they're enjoying their fix and into the "please, just another terawatt, another container-ship full of Quadros, to make it through the day" mode.
More seriously, I could see some legitimate value in saying "no, you can't buy every transistor on the market."
It forces AI players to think about efficiency and smarter software rather than just throwing money at bigger wads of compute. This might be part of where China's getting their competitive chops from-- having to do more with less due to trade restrictions seems to be producing some surprisingly competitive products.
It also encourages diversification. There is still no non-handwavey road to sustainable long-term profitability for most of the AI sector, which is why we keep hearing answers like "maybe the Extra Fingers Machine cures cancer." Eventually Claude and Copilot have to cover their costs or die. If you're Nvidia or TSMC, you might love today's huge margins and willing buyers for 150% of your output, but it's simple due diligence to make sure you have other customers available so you can weather the day the bubble bursts.
It's also a solid PR play. Making sure people can still access the hobbies they enjoy is an easy way to say you're on the side of the mass public. It comes from a similar place to banning ticket scalping or setting reasonable prices on captive concessions. The actual dollars involved are small (how many enthusiast PCs could you outfit with the RAM chips or GPU wafer capacity being diverted to just one AI data centre?) but it makes it look like you're not completely for sale to the highest bidder.
Or you could look at reality, where it generates fake social media posts a lot, and we could all ask: why is this valuable?
What do you think happens when the majority of consumers are priced not only out of bread, but also circuses?
It’s maybe new to you (you’re one of today’s lucky 10,000!), but this kind of market failure has been going on since at least the South Sea bubble and tulip mania, if not all the way back to Roman times.
I wonder, is there any way to avoid this kind of market failure? Even a planned economy could succumb to hype - promises that improved societal efficiency are just around the corner.
> Is there any way to avoid this kind of market failure?
There are potentially undesirable tradeoffs and a whole new game of cheats and corruption, but you could frustrate rapid, concentrated growth with things like an increasing tax on raised funds.
Right now, we basically let people and companies concentrate as much capital as they want, as rapidly as they want, with almost no friction, presumably because it helped us economically outcompete the adversary during the Cold War. Broadly, we're now afraid of having any kind of brake or dampener on investments and we are more afraid of inefficiency and corruption if the government were to intervene than we are of speculation or exploitation if it doesn't.
In democratically regulated capitalism, there are levers to pull that could slow down this kind of freight train before it were to get out of control, but the arguments against pulling them remain more thoroughly developed and more closely held than those in favor of them.
> there are levers to pull that could slow down this kind of freight train before it were to get out of control
Care to share some keywords here?
There is a way, and if anyone tells you we have to go full Hitler or Stalin to do it, they are liars, because the last time we let inequality cook this hard, FDR and the New Deal figured out how to thread the needle and proved it could be done.
Unfortunately, that doesn't seem to be the flavor of politics on tap at the moment.
Sam Altman cornering the DRAM market is a joke, of course, but if the punchline is that they were correct to invest this amount of resources in job destruction, it's going to get very serious very quickly and we have to start making better decisions in a hurry or this will get very, very ugly.
A tax on scale.
Yeah I know HN is going to hate me for saying that.
If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Once "better served" is quantified, you know the coefficient for taxation.
Make no mistake, this coefficient will be a political football, and will be fought over, just like the Fed prime interest rate. But it's a single scalar instead of a whole executive-branch department and a hundred kilopages of regulations like we have in the antitrust-enforcement clusterfuck. Which makes it way harder to pull shenanigans.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
Why? That's exactly the circumstances where the mere potential for small companies to pop up is enough to police the big company's behavior. You get lower costs (due to economies of scale) and a very low chance of monopolization. so everyone's happy. In the case of this DRAM/flash price spike, the natural "small" actors are fabs slightly off the leading edge, that will be able to retool their production and supply these devices for a higher profit.
> the mere potential for small companies to pop up is enough to police the big company's behavior.
If that were true, "you're in Amazon's kill zone" wouldn't be something VC's say to startups. And yet, they do say that.
> If a big company and a few small companies all have identical costs for producing a product, society is better served by having it produced by the few small companies than the one big company.
How so? Costs will be higher with multiple small producers, resulting in higher costs for customers. That's the opposite of "society is served better".
We draw the line at monopolies, which makes sense.
This happens when you get worse and worse inequality when it comes to buying power. The most accurate prediction of how this all plays out, I think, is what Gary Stevenson calls "The Squeeze Out" -> https://www.youtube.com/watch?v=pUKaB4P5Qns
Currently we are still at the stage of extraction from the upper/middle class retail investors and pension funds being sucked up by all the major tech companies that are only focused on their stock price. They have no incentive to compete, because if they do, it will ruin the game for everyone. This gets worse, and the theory (and somewhat historically) says it can lead to war.
Agree with the analysis or not, I personally think it is quite compelling to what is happening with AI, worth a watch.
Markets are voting machines in the short term and weighing machines in the long term. We’re in the short term popularity phase of AI at the moment. The weighing will come along eventually.
> the insane frothing hype behind AI is showing me a new kind of market failure
I see people using "market failure" in weird ways lately. Just because someone thinks a use for a product isn't important, doesn't mean it's a market failure. It's actually the opposite - consumers are purchasing it at a price they value it.
Someone who doesn't really need 128GB of ram won't pay the higher cost, but someone who does need it will.
Going to be awesome tho when OpenAI et al fail because the market is going to be flooded with cheap parts.
Or not, because of inflation, rising cost of living, etc. People said the same about crypto GPUs, but it never really happened in the end. Those cheap pre-LHR RTX cards never really entered the picture.
> resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns
That's basically what the rich usually do. They command a disproportionate amount of resources and misallocate them freely on a whim, outside of any democratic scrutiny, squeezing an incredible number of people and small businesses out of something.
Whether that's a strength of the system or the weakness, I'm sure some research will show.
> … showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
Technically speaking, this is not a market failure. [1] Why? Per the comment above, it is the individuals that are acting irrationally, right? The market is acting correctly according to its design and inputs. The market’s price adjustment is rational in response. The response is not necessarily fair to all people, but traditional styles of neoclassical economic analysis de-emphasize common notions of fairness or equality; the main goal is economic efficiency.
I prefer to ask the question: to what degree is some particular market design serving the best interest of its stakeholders and society? In democracies, we have some degree of choice over what we want!
I say all of this as a person who views markets as mechanisms, not moral foundations. This distinction is made clear when studying political economy (economics for policy analysis), though I think it sometimes gets overlooked in other settings.
If one wants to explore coordination mechanisms that can handle highly irrational demand spikes, you have to think hard. To some degree, one would have to give up a key aspect of most market systems — the notion of one price set by the idea of “willingness to pay”.
[1] Market failure is a technical term within economics meaning the mechanism itself malfunctions relative to its own efficiency criteria.
> where resources can be massively misallocated
It's a little ironic to call this a market failure due to resource misallocation when high prices are exactly how misallocation is avoided.
I'm a little suspicious that "misallocation" just means it's too expensive for you. That's a feature, not a bug.
How is that a market failure??? This is literally a market of supply and demand at its core.
> the insane frothing hype behind AI is showing me a new kind of market failure - where resources can be massively misallocated just because some small class of individuals THINK or HOPE it will result in massive returns.
As someone who advocates that we only use capitalism as a tool in specific areas and try to move past it in others, I'll defend it here to say that's not really a market anymore when this happens.
Hyper-concentration of wealth is going to lead to the same issues that command economies have, where the low-level capital allocation (buying shit) isn't getting feedback from everyone involved and is just going off one asshole's opinion.
Not even. Tulips were non-productive speculative assets. NFTs were what the tulip was. The AI buildout is more like the railroad mania in the sense that there is froth but productive utility is still the output.
Games eventually will move to consoles and the whole PC industry will take a huge hit.
console ram isn't magically cheaper
I don't know if the term console even makes sense any more. It's a computer without a keyboard and mouse. And as soon as you do that, it's a PC. So I don't see how this makes any sense or will ever happen.
Actually, a console is worse than a PC. Its main reason for existence is to enforce DRM on the user to protect copyright/IP.
Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
There still isn't a clear path to profitability for any of these AI products and the capital expenditure has been enormous.
> Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.
Their inventories are not what consumers use.
Consumer DDR5 motherboards normally take UDIMMs. Server DDR5 motherboards normally take RDIMMs. They're mechanically incompatible, and the voltages are different. And the memory for GPUs is normally soldered directly to the board (and of the GDDRn family, instead of the DDRn or LPDDRn families used by most CPUs).
As for GPUs, they're also different. Most consumer GPUs are PCIe x16 cards with DP and HDMI ports; most hyperscaler GPUs are going to have more exotic form factors like OAM, and not have any DP or HDMI ports (since they have no need for graphics output).
So no, unfortunately hyperscalers dumping their inventories would be of little use to consumers. We'll have to wait for the factories to switch their production to consumer-targeted products.
Edit: even their NVMe drives are going to have different form factors like E1.S and different connectors like U.2, making them hard for normal consumers to use.
I imagine the cost is primarily in the actual DRAM chips on the DIMM. So availability of RDIMMs on the market will affect DRAM prices anyway. These days lots of motherboards come with OCuLink, etc., and you can get a U.2 PCIe card for rather cheap.
I put together a small server with mostly commodity parts.
The problem is that it is not entirely clear that the hyperscalers are buying DDR5, instead it seems that supplies are being diverted so that more HBM/GDDR wafers can be produced.
HBM/GDDR is not necessarily as useful to the average person as DDR4/DDR5
It's a bit of a shame these AI GPUs don't actually have DisplayPort/HDMI output ports, because they would make for nice, cheap, and powerful gaming GPUs with a lot of VRAM; they would potentially be really good graphics cards.
Will just have to settle for insanely cheap second hand DDR5 and NVMe drives I guess.
AI GPUs suck for gaming. I have seen a video of a guy playing Red Dead Redemption 2 on an H100 at a whopping 8 FPS! That's after some hacks, because otherwise it wouldn't run at all.
AI GPUs are stripped away of most things display-related to make room for more compute cores. So in theory, they could "work", but there are bottlenecks making that compute power irrelevant for gaming, even if they had a display output.
I wouldn't mind my own offline Gemini or ChatGPT 5. But even if the hardware and model were free, I don't know how I'd afford the electricity.
A single machine for personal inference on models of this size isn't going to idle at a power draw so high that electricity becomes a problem, and for personal use it's not like it would be under load often; and if for some reason you are able to keep it under heavy load, presumably it's doing something valuable enough to easily justify the electricity.
If you can't afford the electricity to afford to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!
But anyway, the trick is to run it in the winter and keep your house warm.
I think you're underestimating economies of scale, and today's willingness of large corporations to provide cutting-edge services at a loss.