I feel like this is a mistake. Pat's strategy is aggressive but what the company needs.
Intel's stock is jumping on this announcement, but I read it as a bad signal for Intel 18a. If 18a were shaping up to be a smash hit, then I don't think Pat gets retired. And if 18a is a success, then it is an even more short-sighted decision by the board.
What this likely means is two-fold:
1. Intel 18a is being delayed further and/or there are significant issues that will hamstring performance.
2. Pat is/was unwilling to split the foundry and design business / be part of an M&A, but the board wants to do one or the other.
If 18a is not ready, I think the best case scenario for Intel is a merger with AMD. The US Govt would probably co-sign on it, with national security concerns overriding the fact that it creates an absolute monopoly on x86 processors. The moat of the two companies together would give the new combined company plenty of time to ramp up their fabs.
>> If 18a is not ready I think the best case scenario for Intel is a merger with AMD.
Aside from the x86 monopoly that would create, I don't think Intel has much of value to AMD at this point other than the fabs (which aren't delivering). IMHO if Intel is failing, let them fail and others will buy the pieces in bankruptcy. This would probably benefit several other companies that could use 22nm and up fab capacity and someone could even pick up the x86 and graphics businesses.
BTW I think at this point the graphics business is more valuable. Even though Intel is in 3rd place there are many players in the SoC world that can use a good GPU. You can build a SoC with Intel, ARM, or RISC-V but they all need a GPU.
Certainly feels like preempting news that Intel 18A is delayed.
Restoring Intel's foundry lead starting with 18A was central to Pat's vision, and he essentially staked his job on it. 18A is supposed to enter production next year, but recent rumors are that it's broken.
The original "5 Nodes in 4 Years" roadmap released in mid 2021 had 18A entering production 2H 2024. So it's already "delayed". The updated roadmap has it coming in Q3 2025 but I don't think anyone ever believed that. This after 20A was canceled, Intel 4 is only used for the Compute Tile in Meteor Lake, Intel 3 only ever made it into a couple of server chips, and Intel 7 was just renamed 10nm.
> Certainly feels like preempting news that Intel 18A is delayed.
I think at this point no one believes Intel can deliver. So, news or not...
Intel GFX held the industry back 10 years. If people thought Windows Vista sucked it was because Intel "supported" it by releasing integrated GPUs which could almost handle Windows Vista but not quite.
The best they could do with the GFX business is a public execution. We've been hearing about terrible Intel GFX for 15 years and how they are just on the cusp of making one that is bad (not terrible). Most people who've been following hardware think Intel and GFX is just an oxymoron. Wall Street might see some value in it, but the rest of us, no.
My understanding is that most of the complaints about Vista being unstable came from the nvidia driver being rather awful [1]. You were likely to either have a system that couldn't actually run Vista or have one that crashed all the time, unless you were lucky enough to have an ATI GPU.
[1] https://www.engadget.com/2008-03-27-nvidia-drivers-responsib...
> If people thought Windows Vista sucked it was because Intel "supported" it by releasing integrated GPUs which could almost handle Windows Vista but not quite.
What does an OS need a GPU for?
My current laptop only has integrated Intel GPU. I'm not missing Nvidia, with its proprietary drivers, high power consumption, and corresponding extra heat and shorter battery life...
For a smaller gripe: they also bought Project Offset, which looked super cool, to turn into a Larrabee tech demo. Then they killed Larrabee and Project Offset along with it.
> Intel GFX held back the industry 10 years. If people thought Windows Vista sucked it was because Intel "supported" it by releasing integrated GPUs which could almost handle Windows Vista but not quite.
Not sure about it. I had friends with discrete GPUs at the time, and they told me that Vista was essentially a GPU stress test rather than an OS.
At the same time, Compiz/Beryl on Linux worked beautifully on Intel integrated GPUs, and was doing way cooler things than Vista was doing at the time (cube desktops? windows bursting into flames when closed?).
I'm a bit sad that Compiz/Beryl isn't as popular anymore (with all the crazy things it could do).
I've been playing Minecraft fine with Intel GPUs on Linux for about 15 years. Works great. If Windows can't run with these GPUs, that's simply because Windows sucks.
I wonder how big a downside an x86 monopoly would actually be these days (an M4 MacBook being the best perf/watt way to run x86 Windows apps today as it is), and how that compares to the downside of not allowing x86 to consolidate efforts against rising competition from ARM CPUs.
The problem with the "use the GPU in a SoC" proposition is that everyone who makes the rest of a SoC already has a GPU for it, often better than what Intel can offer in terms of perf/die space or perf/watt. These SoC solutions tend to coalesce around tile-based designs, which keep memory bandwidth and power needs down compared to the traditional desktop IMR (immediate-mode rendering) designs Intel has.
That’s actually a pretty good point, honestly
I'd like to address the aside for completeness' sake.
An x86 monopoly in the late 80s was a thing, but not now.
Today, there are sufficient competitive chip architectures with cross-compatible operating systems and virtualization that x86 does not represent control of the computing market in a manner that should prevent such a merger: ARM licensees, including the special case of Apple Silicon, Snapdragon, NVIDIA SOCs, RISC-V...
Windows, MacOS and Linux all run competitively on multiple non-x86 architectures.
> An x86 monopoly in the late 80s was a thing, but not now.
Incorrect, we have even fewer x86 vendors now than we did in the 80s. In the 80s you had Intel, and they licensed to AMD, Harris, NEC, TI, Chips & Technologies; in the 90s we had IBM, Cyrix, VIA, National Semi, NexGen, and for a hot minute Transmeta. Plus even more smaller vendors.
Today making mass market x86 chips we have: Intel, AMD, and a handful of small embedded vendors selling designs from the Pentium days.
I believe what you meant was that x86 is not a monopoly thanks to other ISAs, but x86 itself is even more of a monopoly than ever.
> An x86 monopoly in the late 80s was a thing, but not now.
I think you're off by 20 years on this. In the 80s and early 90s we had reasonable competition from 68k, PowerPC, and ARM on desktops, and tons of competition in the server space (MIPS, SPARC, POWER, Alpha, PA-RISC, edit: and VAX!). It wasn't till the early 2000s that both the desktop and server spaces coalesced around x86.
X86 is more than just the ISA. What’s at stake is the relatively open PC architecture and hardware ecosystem. It was a fluke of history that made it happen, and it would be sad to lose it.
I was only just today looking for low-power x86 machines to run FreePBX, which does not yet have an ARM64 port. Whilst the consumer computing space is now perfectly served by ARM and will soon be joined by RISC-V, if a widely-used piece of free and open source server software is still x86-only, you can bet that there are thousands of bespoke business solutions that are locked to the ISA. A monopoly would hasten migration away from these programs, but would nonetheless be a lucrative situation for Intel-AMD in the meantime.
I believe the "x86 monopoly" was meant to refere to how only Intel and AMD are legally allowed to make x86 chips due to patents. X86 is currently a duopoly, and if Intel and AMD were to merge, that would become a monopoly.
This is all very true, and it's why I think a merger between AMD and Intel is even possible. NVIDIA and Intel is also a possible merger, but I actually think there is more regulatory concern with NVIDIA, given how big and dominant they are becoming.
> An x86 monopoly in the late 80s was a thing, but not now
And then in the 2000s AMD64 pretty much destroyed all competing architectures, and in the 2010s Intel itself was effectively almost a monopoly (outside of mobile), with AMD on the verge of bankruptcy.
> x86 monopoly
Wintel was a duopoly which had some power: Intel x86 has less dominance now partly because Windows has less dominance.
There are some wonderful papers on how game theory and monopoly plays out between Windows and Intel; and there's a great paper with analysis of why AMD struggled against the economic forces and why Microsoft preferred to team up with a dominant CPU manufacturer.
Yeah, what both companies would need to be competitive in the GPU sector is a CUDA killer. That's perhaps the one benefit of merging: "Antel" could more easily standardize something.
You don't get a CUDA killer without the software infrastructure.
Intel finally seem to have got their act together a bit with OneAPI but they've languished for years in this area.
This meme comes up from time to time but I'm not sure what the real evidence for it is or whether the people repeating it have that much experience actually trying to make compute work on AMD cards. Every time I've seen anyone try the problem isn't that the card lacks a library, but rather that calling the function that does what is needed causes a kernel panic. Very different issues - if CUDA allegedly "ran" on AMD cards that still wouldn't save them because the bugs would be too problematic.
There are already packages that let people run CUDA programs unmodified on other GPUs: see https://news.ycombinator.com/item?id=40970560
For whatever reason, people just delete these tools from their minds, then claim Nvidia still has a monopoly on CUDA.
Why doesn’t NVIDIA buy Intel? They have the cash and they have the pairing (M chips being NVIDIA's and Intel’s biggest competitors now). It would be an AMD/ATI move, and maybe NVIDIA could do its own M CPU competitor with…whatever Intel can help with.
Why would you want this kind of increased monopolization? That is, CPU companies also owning the GPU market?
WGSL seems like a nice standard everyone could get behind
Reuters has some inside information: https://www.reuters.com/business/intel-ceo-pat-gelsinger-ret...
>Gelsinger, who resigned on Dec. 1, left after a board meeting last week during which directors felt Gelsinger's costly and ambitious plan to turn Intel around was not working and the progress of change was not fast enough, according to a person familiar with the matter. The board told Gelsinger he could retire or be removed, and he chose to step down, according to the source.
Thanks, finally a signal in this thread of noise. I found it unbelievable that all media presented it as if it was his decision. Of course it wasn't.
I "predicted" this three months ago (really, it was inevitable), but gave it 1-6 months.
Now for the what-happens-next popcorn. In a normal world, they would go bankrupt, but this is very much not a normal world.
There are precious few companies who can turn rocks into thinking machines, Intel is one of them for better or worse. It’s a national security issue.
A lot of people on this thread are underestimating how much of a hold Intel has on the chips industry. In my experience, Intel is synonymous with computer chip for the average person. Most people wouldn't be able to tell you what AMD does differently, they'd just say they're a knockoff Intel. Technologically, both companies are neck and neck. But for the average person, it's not even close.
Marketing campaigns only go so far. They’ve been riding the “Intel Inside” slogan for 25 years.
In the meantime, AMD/ARM already won phones, tablets, and game consoles.
Server purchasing decisions aren’t made by everyday people. Intel’s roadmap in that space slipped year for year for at least 10 of the last 15 years.
That leaves Intel with the fraction of the non-mac laptop market that’s made up of people that haven’t been paying attention for the last ten years, and don’t ask anyone who has.
I work in video games, and I think it is still sometimes a problem to use computers that are not based on x86 processors, both in the toolchains and in software/engines. People here say that Intel has lost out on consoles and laptops, but in gaming that is because of x86-compatible AMD chips. Apple laptops were good for gaming when they had x86 and could dual boot. I see bugs people report on games made for x86 Macs that don't work quite right on an Mx chip (though not a huge number).
A friend who worked in film post-production was telling me about similar rare but annoying problems with Mx Apple computers. I feel like there are verticals where people will favor x86 chips for a while yet.
I am not as close to this as I was when I actually programmed games (oh so long ago!) so I wonder if this is just the point of view of a person who has lost touch with trends in tech.
>In the meantime, AMD/ARM already won phones, tablets, and game consoles.
Don't forget laptops. Intel has been terrible on laptops due to their lack of efficiency. AMD has been wiping the floor with them for years now.
2024 is the first year that Intel has released a laptop chip that can compete in efficiency. I hope Intel continues to invest in this category and remains neck and neck with AMD if we are to have any hope of Windows laptops with decent battery life.
>That leaves Intel with the fraction of the non-mac laptop market that’s made up of people that haven’t been paying attention for the last ten years, and don’t ask anyone who has.
Evidently, that leaves Intel the majority of the market.
Remember, most people don't care as much as you or I. If they're going to buy a laptop to do taxes or web browsing or something, they will probably be mentally biased towards an Intel-based chip. Because it's been marketed for so long, AMD comparatively seems like a super new brand.
People miss this. A lot of people will only buy Intel. Businesses and IT departments rarely buy AMD, not just out of brand loyalty, but because of the software and hardware features Intel deploys that are catered to the business market.
This is in large part an OEM issue. Dell or HP will definitely have an Intel version of the machine you are looking for, but AMD versions are hit and miss.
I think this is partly because big OEMs doubt (used to doubt?) AMD’s ability to consistently deliver product in the kind of volume they need. Partly it’s because of Intel’s historically anticompetitive business practices.
>A lot of people will only buy Intel. Businesses and IT departments rarely buy AMD
That's because Intel bribed OEMs to use only Intel chips
Intel’s board is (or should be!) in exactly the right position to assess whether this dam is springing leaks. (It is.)
Last report I read it was ~80% (Intel) vs ~20% (AMD) for PC market. And ~75% (Intel) vs ~25% (AMD) for data center servers.
> And ~75% (Intel) vs ~25% (AMD) for data center servers.
IIRC their data center CPU revenue was about even this quarter, so this is a bit deceptive (i.e. you can buy 1 large CPU instead of several cheaper ones).
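The unit-share vs. revenue-share gap is easy to see with a toy calculation. The 75/25 split is from the parent comment; the 3x relative ASP is a made-up number purely for illustration:

```python
# Hypothetical illustration: a vendor with 25% unit share but a 3x higher
# average selling price (e.g. large many-core server parts) matches the
# revenue of a vendor with 75% unit share.
intel_units, amd_units = 75.0, 25.0   # unit share, percent (parent comment)
intel_asp, amd_asp = 1.0, 3.0         # relative average selling price (made up)

intel_rev = intel_units * intel_asp
amd_rev = amd_units * amd_asp
print(intel_rev, amd_rev)  # equal revenue despite a 75/25 unit split
```

So a 75/25 unit split and a ~50/50 revenue split can both be true at once, which is why quoting either number alone can mislead.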
Steam hardware survey: https://store.steampowered.com/hwsurvey/processormfg/
Windows:
Intel: 64.23%
AMD: 35.71%
Linux:
Intel: 30.15%
AMD: 69.85%
For PCs that can’t be right. For overall consumer devices, Windows is at 25.75%, Linux is at 1.43%, and macOS is at 5.53%.
Ignoring ChromeOS, and assuming 100% of Windows and Linux is x86 (decreasingly true - the only Win11 I’ve ever seen is an ARM VM on my Mac) and 100% of Mac is ARM (it will be, moving forward), that puts ARM at 20% of the PC market.
Interpolation from your numbers puts Intel at 64% (with a ceiling of 80% of the PC market; 25% of consumer computing devices unless Windows makes a comeback).
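For what it's worth, that interpolation can be redone explicitly. This sketch uses the OS-share figures quoted in the parent comments and (generously) assumes Steam's per-OS Intel splits apply to all Windows and Linux machines:

```python
# Assumptions (all hedged): OS shares from the parent comment; Steam's
# per-OS Intel shares applied to ALL Windows/Linux PCs; all Macs = ARM.
windows, linux, mac = 25.75, 1.43, 5.53   # share of consumer devices (%)
pc = windows + linux + mac                # treat these three as "the PC market"

arm_share = mac / pc                      # Macs = ARM, everything else x86
intel_of_x86 = (0.6423 * windows + 0.3015 * linux) / (windows + linux)
print(f"ARM ~{arm_share:.0%} of PCs, Intel ~{intel_of_x86:.0%} of x86 PCs")
```

This prints roughly 17% ARM and 62% Intel-of-x86, close to the rounder ~20% and ~64% figures above.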
I dunno, I've seen more and more people referencing the crash bugs in the latest gens and how Intel lied about it through their teeth. And Intel having lost to Apple on the CPU front, never having caught up to Nvidia on the GPU front, and basically just not doing anything for the last decade certainly hasn't helped their reputation.
Let them die. Maybe we'd actually see some new competition?
I doubt many people are making purchasing decisions based on Intel branding. Any kind of speed advantage has not been a dominant factor in the minds of most low information/brand influenceable consumers who are buying x86 machines. Everybody else looks at reviews and benchmarks where Intel has to show up with a good product and their branding doesn't get them much.
>I feel like this is a mistake. Pat's strategy is aggressive but what the company needs.
He's also 63. Has plenty of money to survive the rest of his life. Has eight grandchildren. There's so much more to life than business. What's to say he doesn't want to simply enjoy life with more connection and community to loved ones around him?
That would be a healthy, balanced and long-term-oriented approach. But those who get to the level of CEO are subjected to intense forces that select against those traits.
I don’t know much about this guy but it’s reasonable to assume that any C-level exec will hold on to the position for dear life until they are forced out.
Here's more info on Pat. He is not your average CEO. He wrote this book almost 20 years ago. Why resort to assumptions?
The Juggling Act: Bringing Balance to Your Faith, Family, and Work
https://www.amazon.com/Juggling-Act-Bringing-Balance-Family/...
> any C-level exec will hold on to the position for dear life until they are forced out
I don't know. Frank Slootman's retirement from Snowflake earlier this year was certainly not celebrated by any significant stakeholders. I'd imagine at some point someone like Frank realizes that they are worth more than Tim Cook, they consider that they're in their mid-60s, and they decide the remaining time they have on earth might be better spent in other ways.
Every person in the workforce, no matter how ambitious or how senior, is forced into the calculus of money and power vs. good years remaining. I expect the rational ones will select the balance point for themselves.
That’s not at all who Pat is
True CEO retirements are announced in advance and do not lead to a strange co-interim CEO situation.
> What's to say he doesn't want to simply enjoy life
News: https://www.cnbc.com/2024/12/02/intel-ceo-pat-gelsinger-is-o...
Intel CEO Pat Gelsinger ousted by board
That "fabs will report to me" should be a tell that there is a lot of internal opposition...
Worse (for Intel) what can happen is Intel HP-isation - splits and sells.
But there is a lot of good news for them in cpu world: another B's granted, military buys Intel, new no-HT arch. And 80 bit memories like in Multics, can be true virtualisation on x86.
Even if x86 is dead Intel still have fabs - AMD can soon print in them :)
But that multigeneration refreshes are still a mistery - is it Intel's problem or maybe something else eg. simply someone have a some patent ? :>
He got fired, dawg. This is like being told that someone is sleeping with the fishes and concluding that the guy just finds lying in bed next to a bunch of sturgeon the most relaxing way to sleep.
> The moat of the two companies together would give the new combined company plenty of time to ramp up their fabs.
Doubt.
Neither company is particularly competitive on processor or GPU architecture, nor on fabrication.
A merger of those entities looks like nothing but a recipe for further x86 stagnation and an even quicker death for the entities involved imho.
In particular, I cannot see what's good in it for AMD. The fabs have no use/clear path forward. Their processors/GPUs either match or outmatch the Intel offering.
A Broadcom/Apple takeover of Intel sounds much more reasonable.
> A Broadcom/Apple takeover of Intel sounds much more reasonable.
Out of curiosity, what would make Intel interesting to Apple? Apple already acquired Intel's modem business and they have their own CPU and GPU.
I think Apple has the cash, culture and management to make the fabs work.
But I'm just speculating.
Maybe for the fabs? It might be attractive for Apple to move production stateside via Intel's fabs, but on the other hand I don't think Intel's fabs can do what Apple wants.
>In particular I cannot see what's good in it for AMD. The fabs have no use/clear path forward. Their processors/GPUs either match or outmatch the Intel offering.
I can, but it's not technical. Intel has a huge advantage in several markets, and has strong relationships with many OEMs like Dell. Intel, even though their market cap is now a fraction of AMD's, still has a huge lead in marketshare in OEM systems and servers. (Several other posts in this thread have real numbers.)
If AMD bought out Intel, it would now get all of that, and be able to push all these OEM and server customers into AMD's solutions instead.
> is particularly competitive on either processor or GPU architecture nor fabrication.
Who is, then? Apple is of course still ahead in lower-power chips. But Apple is not in the desktop/workstation/server market, and there are hardly any alternatives to AMD or Intel there.
e.g. the M2 Ultra, Apple's fastest "desktop" CPU, is slower than the 14700K you can get for $350. Seems pretty competitive...
> The moat of the two companies together would give the new combined company plenty of time to ramp up their fabs.
I'm not convinced of this. Fabs are incredibly expensive businesses. Intel has failed to keep up and AMD spun off their fabs to use TSMC.
There is also ARM knocking at the door for general computing. It's already gaining traction in previously x86 dominated markets.
The model for US based fabbing has to include selling large portions of capacity to third party ASIC manufacturers, otherwise I see it as doomed to failure.
> There is also ARM knocking at the door for general computing. It's already gaining traction in previously x86 dominated markets.
I know anecdotes aren't data, but I was talking with a colleague about chips recently and he noticed that converting all of his cloud JVM deployments to ARM machines both improved performance and lowered costs. The costs might not even be the chips themselves, but less power and thermal requirements that lowers the OpEx spend.
Yeah, my company is gearing up to do the same. We primarily use the JVM so doing the arm switcharoo only makes sense.
They would have at least 5 years to figure it out before ARM becomes viable on desktop assuming there continues to be movement in that direction. There is so little incentive to move away from x86 right now. The latest Intel mobile processors address the efficiency issues and prove that x86 can be efficient enough for laptops.
IT departments are not going to stop buying x86 processors until they absolutely are forced to. Gamers are not going to switch unless performance is actually better. There just isn't the incentive to switch.
> There is so little incentive to move away from x86 right now.
> IT departments are not going to stop buying x86 processors until they absolutely are forced to.
IT departments are buying arm laptops, Apple's.
And there is an incentive to switch: cost. If you are in AWS, you can save a pretty penny by adopting Graviton processors.
Further, the only thing stopping handhelds from being arm machines is poor x86 emulation. A solvable problem with a small bit of hardware. (Only non-existent because current ARM vendors can't be bothered to add it and ARM hasn't standardized it).
Really the only reason arm is lagging is because the likes of Qualcomm have tunnel vision on what markets they want to address.
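The cost incentive mentioned above is easy to sketch. All prices and fleet sizes here are hypothetical, but Graviton instances are typically listed around 10-20% below comparable x86 instances:

```python
# Back-of-envelope fleet savings from an x86 -> ARM instance swap.
# Prices and fleet size are made-up; plug in real on-demand rates.
x86_hourly = 0.17                 # hypothetical x86 instance price ($/hr)
arm_hourly = 0.136                # hypothetical Graviton price, 20% lower
fleet, hours_per_year = 200, 24 * 365

savings = fleet * hours_per_year * (x86_hourly - arm_hourly)
print(f"annual savings: ${savings:,.0f}")  # -> annual savings: $59,568
```

And the parent comment's point about power and thermal OpEx would come on top of the raw instance-price difference.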
> IT departments are not going to stop buying x86 processors until they absolutely are forced to.
Plenty of them are buying Macbooks. It's definitely a small percentage of the worldwide market, but not entirely insignificant either.
> There is so little incentive to move away from x86 right now
Massively lower power consumption and way less waste heat to dispose of.
Literally the two biggest concerns of every data centre on earth.
> They would have at least 5 years to figure it out before ARM becomes viable on desktop assuming there continues to be movement in that direction.
What's this based on? Surely the proportion of desktops that need to be more powerful than anything Apple is doing on ARM is very small. And surely Apple isn't 5 years ahead?
Honestly, I wouldn't put it past IBM to turn it around with a POWER revival. They've been doing some cool stuff recently with their NorthPole accelerator[1], using a 12nm process while at it, indicating there's much room for improvement. It could eventually become a relatively open, if not super affordable, platform. There's precedent with OpenPOWER! And not to mention RISC-V, of course, championed by Jim Keller et al. (Tenstorrent), but it's yet to blossom, all the while ppc64el is already there where it matters.
I say, diversity rules!
[1]: https://research.ibm.com/blog/northpole-llm-inference-result...
IBM did lay an egg with Power10, though. They cut corners and used proprietary IP and as a result there are few (are there any?) non-IBM Power10 systems because the other vendors stayed away. Raptor workstations and servers are a small-ish part of the market but they're comparatively highly visible - and they're still on POWER9 (no S1 yet).
They did realize the tactical error, so I'm hoping Power11 will reverse the damage.
PPC’s likely last hope died when Google didn’t go ahead with OpenPower.
Talos is the exception that proves the rule, sadly.
Friends who work at Intel said Gelsinger and the board have done EVERYTHING wrong in the past four years, from blowing key customer accounts to borderline malfeasance with payouts. The board needs to go too, for enabling it. The merger with AMD sounds like the right path.
The US government wouldn't let Intel go down; this is a matter of national security (only grown semiconductor fabs left on US soil) and the edge of US tech dominance.
When that happens typically the company starts optimizing for sucking money from the government. From the point of view of the consumer Intel would be finished.
That's the bet I made after the crash last summer. I think the USG only really cares about the fabs, as we've shown the ability to design better chips than Intel's here. Time will tell if I'm right.
> only grown semiconductor fabs left on US soil
not sure what a "grown" semiconductor fab is but follow this link and sort by location https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat... The number of homegrown companies with fabs is greater than 1
Since HN generally considers everything older than two nodes completely irrelevant, all of those fabs except for Intel are completely irrelevant.
Okay, I forgot about Texas Instruments, but let's not kid ourselves here: Intel pretty much runs the market (consumer and business) here.
Letting it fail would create so many losses, not just in security; the effect on the economy would also be huge.
Pretty much fair game for speculation. The only way this is not bad for the tech industry is if he resigned due to medical or age reasons. That would not be unexpected.
Doubtful that is the issue with Intel's track record. Curious when we will know if 18A is competitive or not.
> If 18a is not ready, I think the best case scenario for Intel is a merger with AMD. The US Govt would probably co-sign on it, with national security concerns overriding the fact that it creates an absolute monopoly on x86 processors. The moat of the two companies together would give the new combined company plenty of time to ramp up their fabs.
No way other countries would allow that. If A-tel (Amd-inTEL) can not sell to the EU the merger will not happen.
This does not seem orderly/planned enough to be simple age-related.
I tend to agree. He's not outside the window where someone might choose to retire. But no named permanent successor? Retiring immediately? Those tend to speak to a fairly sudden decision.
> If A-tel (Amd-inTEL) can not sell to the EU the merger will not happen.
What the EU gonna do then? Stop buying computers? Perform rapid continental ARM transition for mythical amount of money?
Just stop buying new Intel chips, and continue buying Arm chips. It's not like every single existing x86 CPU would need to be taken away and destroyed.
Apple has made it fairly obvious, even if it was not already with smartphones and chromebooks, that Arm is a viable, realistic, and battle-tested alternative for general purpose computing. Windows 11 even runs on Arm already.
It would not happen "tomorrow" - this would be years in court if nothing else. This would give Dell/HP/Lenovo/whoever plenty of time to start building Arm laptops & servers etc for the European market.
And who knows what RISC-V will look like in a few more years?
The EU has done a bunch of stupid anti-consumer shit in tech already (hello cookie warnings that everyone now ignores), so I would not be surprised if this happened.
> What the EU gonna do then?
Seize or terminate their patents and copyrights. Issue arrest warrants for criminal evasion. Compulsory licensing of x86 to a European design firm immunized by EU law.
> Perform rapid continental ARM transition
Yes.
Windows is on ARM. Apple is on ARM. AWS and Ampere make decent ARM servers. You have decent x86 user-space compatibility on ARM laptops. That is all users want.
I doubt it will cost 'mythical amounts of money'. Most users use a web browser and an office suite. I doubt they will know a difference for a while.
Hard to imagine it ever coming to that but presumably massive fines?
> Perform rapid continental ARM transition for mythical amount of money?
And what is Intel + AMD going to do? Not sell CPUs in Europe?
People who are presumably very well-off financially can retire a tad on the early side for all sorts of reasons or a combination thereof. Certainly he has made some significant course corrections at Intel but the most charitable thing one can say is that they will take a long time to play out. As you say, a merger with AMD seems like a non-starter for a variety of reasons.
Is Intel really in such a dire situation that they need to merge with AMD? AMD has been in troubling situations in the past (some thanks to Intel's illegal dealings), yet they managed to survive, and they were nowhere near Intel's size.
More importantly, AMD's troubles made them refocus and improve their products to the levels we're seeing today.
Giving life support to dinosaurs isn't how you create a competitive economy.
I think that Pat's strategy is what the fab needs to be successful.
However, I think the fab and design businesses should be separate companies, with separate accountability and goals/objectives. There is just too much baggage in keeping them coupled. It doesn't let either part of the company spread its wings and reach its full potential when they are attached at the hip. From an outside perspective, that is the thing Pat has seemingly been focused on (keeping them together), and it's why people have lost faith in his leadership.
I also don't think that, from an investment/stock standpoint, accelerating the depreciation/losses related to restructuring into the most recent quarter was a wise decision, since what Intel really needed was a huge win right now.
Nope.
Look at what Pat did to VMware. He's doing the exact same thing at Intel. He came in, muddied the waters by hiring way too many people to do way too many things and none of them got done appropriately. Pat is a huge part of the problem.
I had the unfortunate pleasure of watching him not understand, at all, VMware's core competency. It was a nightmare of misunderstanding and waste in that company under his leadership.
Intel turned into even more of a laughing stock under Gelsinger. I say: good riddance. He burned time, capital and people at both VMware and Intel. He's a cancer as a CEO.
When he came back to Intel, I was saying that all this ‘finally an engineer in charge’ stuff was misunderstanding Pat Gelsinger, and VMware was front and center in my thinking.
Do you have more details or a reference for the VMware activity? The Wikipedia VMware article is pretty brief.
Given the push of ARM designs into the desktop and server space, that monopoly doesn't seem to me as much of a danger as it might have a decade ago. I imagine any anti-competitive behavior in x86 would only accelerate that trend. Not that monopolies shouldn't be a concern at all, but my thought is that it's not quite that large of a danger.
If a breakup is in the works for Intel, a merger of the foundry side with Global Foundries would make more sense than AMD. Intel's foundries, even in the state they're in, would likely be a step up for GF. And given the political sensitivity, GF already has DoD contracts for producing chips.
Didn't the US government just give Intel $7b on the condition they don't spin off the foundry business?
https://finance.yahoo.com/news/intels-7-86-billion-subsidy-0...
I think you would have to go through the specific conditions - they put up restrictive conditions, but they don't seem impossible to work with.
Both of those things might be true, but to me, it looks more like the board is acting out of fear of shareholder lawsuits. Pat's strategy has significantly destroyed value because the market lacks visibility.
Dropping Pat will alleviate their feeling of having to do "something."
As for M&A, it wouldn't just have to be approved by the DoJ. And the Chinese will never, ever approve of it (but would have to). Doing a transaction without approval from the CMA would be like nuclear financial war.
I think it's high time to gut Intel into parts, a la GE. Sell the fabless side to QCOM or Broadcom. Sell the fabs off one by one to GF, Tower, UMC, or even TSMC. Find a PE firm for the leading edge and reconstitute it, with significant R&D credits, as a kind of Bell Labs 2.0.
Or something like that.
> I think the best case scenario for Intel is a merger with AMD (…) it creates an absolute monopoly on x86 processors
If this happens, couldn’t they force them to give out licenses as a condition? The licensing thing has been such an impediment to competition that it seems like it’s about time anyway.
It's a good point, and yes, afaik any redress can be requested as a condition for blessing a merger.
One argument I've heard in favor of the split is this: if you are AMD/NVDA/another top player, do you want to send your IP to an Intel-owned fab for production?
At least in theory, a fully independent, split/spun out standalone fab removes this concern.
That said - what does Intel have to offer the top players here? Their fabs aren't state of the art. And what's the standalone value of post-spin fabless Intel if their chip designs are as far behind as their fabs?
This certainly presents a conundrum for US policy since we need fabs domestically for national security reasons, but the domestically owned ones are behind.
Eh, firewalls can be made strong enough, at least for some things. A software parallel is: you are Apple / Facebook, do you use Azure and/or AWS? I wouldn't, if it were me, but they do.
Azure/AWS is cloud/B2B, AAPL/FB are B2C consumer goods/services. Different customers, different industries. There is some overlap, but moat is in other places.
AMD/NVIDIA are in same business as Intel and have same pool of customers.
Maybe, but I think AWS is arguably a different scale of automation. There’s no Amazon human in the loop looking at your source. Sure, a human SRE could peek into your system, but that’s something of an exception.
I can’t imagine fabs have that level of automation. It’s not like sending a file to your printer. It’s a multi month or year project in some cases to get your design produced. There’s many humans involved surely.
> ...the best case scenario for Intel is a merger with AMD...
Why would AMD want a merger? They aren't a charity, and certainly don't need the distraction.
Well, for at least a time they would have the entire x86 market. That is not nothing. Also AMD may want to get back into the fab business. Without competition in x86 why not use Intel's fabs?
They dont need to merge with intel to get the entire x86 market, they'll be getting that anyway if Intel folds.
Even if Intel gets bought out, it'll be in pieces. Nobody wants to enter the x86 market, but there may be smaller segments of the business that can help an ARM-based business, or someone looking to get into GPUs.
And everyone would rush to migrate away from x86.
Having a sole supplier for CPUs is a bad strategy.
They would technically have no market, because the Intel-AMD X86 license is non-transferable and expires if one party goes out of business.
Why would the US govt allow a merge with AMD?
Sure they won't allow Intel to be bought by a foreign company, but surely everyone would much rather see Intel being bought by literally any other company than AMD and Nvidia.
Nvidia makes a lot more sense than AMD; it is better for the market (preserving some competition in x86), and at least Intel does something Nvidia doesn’t.
China and the EU would never allow an Nvidia Intel merger, not under any scenario the US would find acceptable.
They'll barely allow Nvidia to acquire anybody at this point, no matter how small. See recent EU response to Run:ai. Intel would be considered 100x worse.
What do they do that Nvidia doesn't (and that Nvidia would care about)?
They already do networking, photonics, GPUs, high speed interconnects, and CPUs. They are planning on selling their FPGAs (the Altera acquisition) to Lattice.
The only things left are their fab ops, thunderbolt/usbc, wifi, and ble.
Their fab ops would take over a decade of heavy investment to catch up to TSMC or Samsung and idk if even Nvidia is ambitious enough to take that on.
Wifi and BLE could be good additions if they wanted to branch out their mellanox portfolio to wireless. Thunderbolt/USB C also might be worthwhile.
But that IP is probably going to be cheaper to buy piecemeal so idk if it's worth it to buy the whole company outright.
That would be a hard call
One of the reasons Intel "let" AMD compete in the x86 space is the US Gov requirement to be able to source chips from at least two vendors
Maybe they'll sell Intel to Northrop Grumman /hj
Boeing should buy Intel the way they bought McDonnell Douglas. It's gonna be a success, trust me.
Probably should add GMC in there.
I heard they’re building an iOS/Android replacement. Think of the vertical integration!
I’m picturing a boot-looping cargo plane full of hummers dropping like a stone — doesn’t get much more vertically integrated than that. Think of all the layers they can eliminate.
What other US companies are equipped and interested in running a giant chip design/fab? NVIDIA and AMD are likely the only two candidates.
I do not at all think it will happen, nor does it make any sense, but the rumours of Apple seemingly being interested in buying out Intel don't seem to be going away.
I can see them wanting certain parts of the business (GPU mainly), but on the whole it doesn't make a lot of sense.
I don't see Intel as a single entity being valuable to any US business, really. You're essentially buying last year's fall line; there's very little use for Intel's fabs without a huge amount being spent to get them up to modern standards.
It'll all come down to IP and people; that'll be the true value.
Micron Technology is the only one that comes to mind, but they are more on the memory side of things. The last time they were on a level with Intel was in the 90s, when they both made DRAM, but Intel pivoted to processors and networking.
There are also options like Texas Instruments or Microchip. Of course far more unlikely than either nvidia or amd, but definitely options.
Apple is the obvious one. Essentially the only place with both the capital to do it and the extreme vertical integration enthusiasm. AMD hopefully still remembers running a fab.
Ford, GM... The big automakers got burned with the chip shortage after COVID (this is their fault, but still they got burned)
Merger with AMD is very unlikely for competitive reasons, but I’ve read some rumors that 1) Apple will push some production to Intel from TSMC and 2) Apple (and Samsung) are considering buying Intel.
>the best case scenario for Intel is a merger with AMD
Oh man, the risk in that is extreme. We are moving away from x86 in general, but wow, that's... a big jump in risk.
And really, AMD spun off Global Foundries. AMD doesn't want to run a fab.
Does this mean that Intel's fabs should split for Global Foundries, and the Intel design team should go to AMD?
I seem to recall that Intel was talking about the same kind of split. Maybe the Intel child company and AMDs would merge, or maybe they'll stay separate and the parents will merge?
Sure, let's create a monopoly around one of the most valuable commodities in the world.
What could go wrong?
I think Apple Silicon has shown us that x86 doesn't have the monopoly potential it once had.
Apple Silicon was designed to be efficient at emulating x86-64.
If you take that away, it becomes irrelevant (like many other ARM-based processors that struggle to be a good product because of compatibility).
Apple has a promising path for x86 liberation, but it is not there yet.
It's so much worse. They put a CFO and a Marketing/Sales professional in charge.
> ...then I don't think Pat gets retired.
This implies that he was pushed out, rather than chose to retire. I can't see anything in the article to suggest this, do you have another source?
Merging AMD and Intel sounds like Penn Central all over again
> I think the best case scenario for Intel is a merger with AMD
I think a merger with Nvidia would be more likely given the antitrust issues that a merger with AMD would bring up.
That assumes a functional antitrust mechanism. We don't know what the next admin will do yet other than attempt technically illegal revenge on people they hate.
Hell no, I don't want the Intel management structures coming over here. Qualcomm is welcome to them.
Agree with most, except that merging usually costs more, not less. It usually pencils out because you get to axe a lot of employees in the name of synergy, I mean, duplicated departments.
> the best case scenario for Intel
That's the best exit case for the shareholders. It's the worst case for Intel's employees, customers and partners.
> would probably co-sign on it for national security concerns
This is equally laughable and detestable at this point in history. My personal security is not impacted by this at all. Weapons manufacturers honestly should not have a seat at this discussion.
> overriding the fact that it creates an absolute monopoly on x86 processors.
Yet this isn't a problem for "national security?" This is why I find these sentiments completely ridiculous fabianesque nonsense.
>I think the best case scenario for Intel is a merger with AMD
Oh no no no no. Oh hell no. For now, we need competition in the x86 market, and that would kill it dead. Imagine Intel re-releasing the Core 2 Quad, forever.
The problem is that this transition is very capital intensive and all the money was spent on share buybacks the past decades. The stock market looks at CPUs and GPUs and likes the latter a lot more so no fresh money from there. At the moment the only stakeholder with a vital interest in Intel semiconductor capabilities is the US government and even that may change as a result of Trump.
He did not get Larrabee flying, but yeah, it was after him that Intel did not put in the effort that was needed.
Nonetheless, his comment about Nvidia being lucky was anything but a smart comment.
The mistake Pat Gelsinger made was that he put his faith in the American manufacturing workforce. Very sad.
Nope. From what I heard from VMware, he was just a bad manager. He seems to be a person skillfully playing the org game, yet when it came time to deliver, he just flopped.
Struggling companies with totally rotten management like to bring in such "stars" (pretty shrewd people who have built themselves a cute public image as supposedly talented engineers promoted into higher management on their merits); Yahoo/Mayer comes to mind as another example. They de facto complete the destruction of the company while the management rides the gravy train.
I worked for 3 months at Intel. I can genuinely say that there is no saving that company. Recently, they have been hiring many PhDs from various US universities (particularly Indians) to try to compensate (they offer generous stock and are hiring like crazy right now). There are two major problems I saw. The first is a lack of genuine interest in fabs: most people are there for the Intel name and then leave, or, in the case of Indians, are there for visa purposes. (Mind you, we were not allowed to hire people from China since Intel is subject to export laws.) The biggest problem by far is lack of talent. Most of the talent I know is either at Apple or Facebook/Google, including those trained in hardware. Intel is bound to crumble, so I hope we as taxpayers don't foot the bill. There was an unwillingness to innovate, and almost everyone wanted to maintain the status quo. That might work in traditional manufacturing (think tennis rackets, furniture...), but fabs must keep improving their lithography nodes or they get eaten by the competition.
A few years ago a CEO of Intel (not Gelsinger) said something like "our CPU architecture is so well-known every college student can work on it". A friend working at Intel at that time translated it to me: "we will hire cheap students to work on chips and we will let the expensive engineers leave". At least in his department the senior engineers left, they were replaced by fresh graduates. It did not work, that department closed. I have no idea how widespread this was inside Intel, but I saw it in other big companies, in some with my own eyes.
Funny you mention this. In my brief stint there, I saw a fresh college graduate get promoted to a lead for a Scanning Unit simply because the current lead was retiring (I was actually offered that position, but I was leaving and turned it down). They were trained in less than a month by shadowing the lead on-the-verge-of-retirement. The engineer who got promoted was at Intel less than a year, and had no prior internship experience (they were hired in 2021 when chips were in desperate need of talent. You might recall the chip shortage that affected cars etc.)
I know some people who understand x86 [0] very well. Most of them do not work at Intel. Those that do tend to be on the OSS side and don’t really have any sway in the parts of Intel that design new hardware.
And this is a problem! Most of Intel’s recent major architectural changes over the last decade or so have been flops. [1] Quite a few have been reverted, often even after they ship. I presume that Intel does not actually have many people who are really qualified to work on the architecture.
[0] I’m talking about how it interacts with an OS and how you put it together into a working system. Understanding stuff like SIMD is a different thing.
[1] AVX and its successors are notable exceptions, but they still have issues, and Intel did not really rock the early implementations.
> A few years ago a CEO of Intel (not Gelsinger) said something like "our CPU architecture is so well-known every college student can work on it". A friend working at Intel at that time translated it to me: "we will hire cheap students to work on chips and we will let the expensive engineers leave"
Reminds me of the Boeing managers saying that they didn't need senior engineers because its products were mature.
A few blown doors and deadly crashes later, that didn't age well.
> Most of the talent I know is either at Apple or Facebook/Google
A relative of mine with a PhD sounds exactly like this. Worked for Intel on chip-production tech then was hired by Apple about 10 years ago, working on stuff that gets mentioned in Apple keynotes.
While I am sure the foot soldier quality is important, we ought to put the burden on leadership a bit more. I am not sure AMD had a better talent pool (I don't work in the industry so I don't know!) ten years ago. Culture is predominantly shaped by those already in power -- it sounds like they need a change in culture.
> lack of genuine interest in fabs - most people are there for the Intel name
I can actually believe this. Most of the other arguments tend to be rather vague, waving at execution or some stock-related motivation (like needing TSMC's business); a lack of genuine interest among employees, for a mission that was never sold to them or the market especially effectively, seems fairly believable.
Most people are there for the chips, for making great designs in silicon, and being market leaders in CPU architecture. Not running the equivalent of an offshoring division making other people's stuff.
The entire implementation has seemed rather haphazard and not sold with much real motivation. Honestly, the entire situation feels a bit like Afghanistan (if that's a bit incendiary)
Nobody really knows why they're going. Nobody really knows what they're trying to accomplish. The objectives on the ground seem vague, ephemeral, and constantly changing. There's not much passion among the ground troops about the idea. The leaders always seem to be far away, and making strange proclamations, without actually putting boots in the dirt. The information released often feels multiple personalityish, like there's far too many cooks in the kitchen, or far too many puppeteers pulling strings in every direction. And afterward you find out it was mostly some dumpster fire driven by completely different motivations than what were publicly acknowledged.
The senior engineers I saw there are talented. And Intel has benefits and stock packages that rival those of big tech. I think I can expand on your point by saying the more senior engineers were risk-averse and on the verge of retirement, and the young engineers were just there for the Intel name or some other reason. There are surprisingly few middle-aged, long-term people there. This would be expected in software (Facebook/Google), but it is a recipe for disaster in hardware, where long-term thinking is critical to advance lithography (changes don't happen overnight). I also was surprised by how few of Intel's engineers believed in Intel. The starkest observation I made was that senior engineers would max out their stock purchase plan, but many young engineers would abstain. If the engineers don't believe in the product they are working on, I don't accept that the government must bail it out. I hope some investigative journalist writes a book on Intel and Boeing someday, as I would be curious how things unfolded and got to this point. There are many similarities (I never worked for Boeing, but have friends in Seattle that describe the culture in similar terms to what I saw at Intel). Also, to your last point, the Intel name does not hold as much weight as it did in the Grove days.
> And Intel has benefits and stock packages that rival those of big tech.
Given Intel’s stock returns over the past 15 years, Intel would have to offer insane cash compensation to rival big tech.
Levels.fyi indicates Intel heavily underpays, which is what I would expect.
https://www.levels.fyi/?compare=Intel,Apple,Google&track=Sof...
Like I mentioned down below, used to work with the space agency back in the day, and by extension, Boeing. Even late 2000's, early 2010's, Boeing was a dumpster fire. Visited their local offices and the place looked like a hoarder hole. Boxes thrown everywhere, haphazard cabinets just left places, roaming meetings in sparse rooms. Seemed like homeless were camping there rather than a functional company.
The meetings with them felt that way too. Watch the same slides for months and wonder how nobody was moving anywhere with anything actually involving choices. "So, we've got these three alternatives for SLS, and like 10 other nice-to-have engineer pipe dreams." "Right, you'll choose the obvious low cost choice A. Why are we having this meeting again?" Many months later, after endless dithering, surprise, obvious choice using almost no hardware changes is chosen. All engineer nice-to-have pipe dreams are thrown away. There was much rejoicing at the money successfully squandered.
Do you think Intel can improve the quality of engineers on the fab side by offering PhD-level salaries to people with a BS/MS from good US schools? I suspect that Intel hires PhDs from subpar universities.
Don't want to dox myself, but I worked at one of the fabs (Fab11X)
Your two major problems would both be solved by paying more, but it sounds like they are paying well, according to you?
Genuine interest is not the only way to get great results. Excellent pay can do so as well.
And lack of talent, again, excellent pay.
Got a link?
> Most of the talent I know is either at Apple or Facebook/Google
That's a damn shame. Big tech monopolies are screwing up the talent market. Nobody can match their comp and it's bullshit
The problem is that Intel is poorly run. Intel should be printing money, and did for a long time until a string of bad leadership. If they had brought in Gelsinger after Otellini, which they were reported to have considered, the company might be in a much better position.
But alas, Intel is a mega bureaucratic company with a few tiny siloed teams responsible for innovating and everyone else being test and process jockeys. I think Gelsinger wasn't given enough time, personally, and agree with the OP that even someone like Elon would struggle to keep this sinking ship afloat at this point.
BK wanted wearables and Bob Swan wanted to cut costs; neither of them was a visionary, nor did they really understand that Intel was a hard-tech company. Intel had achieved such dominance in such an in-demand, growing market that all they had to do was make the technology better (smaller, faster, lower power) and the money would continue to flow. The mandate was straightforward, and they failed.
The companies that are big tech today took the risks and are now reaping the rewards. Intel decided not to take the risks, so now it doesn’t reap the rewards.
No one stopped Intel from paying for talent. Intel’s shareholders decided to pay themselves instead of investing in the business by hiring great employees. That’s how you go from being a leader to a laggard.
I am sure Intel has enough cash to hire some good talent (like just offer talented people next-level salary at $FAANG), the problem is deeper in the hiring pipeline -- convincing people of the budget and actually scouting and retaining good people
Difficult to see how this is anything other than a failure. I had high hopes when Gelsinger returned, but it seems that BK had done too much damage and Gelsinger didn't really get a grip on things. One of the things I heard that really highlighted the failure of this recovery was that BK had let Intel balloon in all sorts of ways that needed to be pared back and refocused, but head count under Gelsinger didn't just stay static but continued to significantly grow. It's no good giving the same politically poisonous, under-delivering middle management more and more resources to fail at the same job. They really need to clear house in a spectacular way but I'm not sure who could even do that at this point.
They have made too many bad choices, and everyone else has been eating their lunch for the last few years. They are a non-factor in the mobile space where ARM architectures dominate. They are a non-factor in the GPU market where NVDA dominates ahead of AMD. They were focused heavily on data centres and desktop/laptop CPUs where ARM is also increasingly making inroads with more efficient designs that deliver comparable performance. They are still struggling with their fab processes, and even they don't have enough money to make the investment needed to catch back up to TSMC. There is a good reason that even Global Foundries has given up on the bleeding edge some time ago.
They are in a deep hole, and it is difficult to see a future where they can restore their former glory in the foreseeable future.
> where ARM is also increasingly making inroads with more efficient designs that deliver comparable performance.
ARM isn't doing any such thing. Apple & Qualcomm are, though. ARM itself, if anything, looks weak. Their mobile cores have stagnated, their laptop attempts have been complete failures, and while there are hints of life in the data center, it seems mediocre overall.
This feels a bit pedantic. ARM-the-CPU-architecture is increasingly making inroads with more efficient designs that deliver comparable performance to Intel's x86 chips, thanks to Apple and Qualcomm. ARM-the-holding-company is not doing that, they just make mediocre designs and own IP.
Don't forget their "brilliant" strategy of suing their own customers.
The US government wants them making SOTA semiconductors. Biden wanted it, and I highly suspect Trump will want it too.
> last few years.
It's pretty much two decades at this point.
Apple saw the writing on the wall and bailed years ago, built their own chips and denied them a large customer.
It’s a really bad sign when a customer decides it can out innovate you by internally copying your entire business and production line.
I would argue there were many good things but not well delivered. The Optane persistent memory should've been revolutionary for databases but Intel just put it out and expected people to do all the software.
I'm seeing the same thing now with Intel QAT / IAA / DSA. Only niche software support. Only AWS seems to have it and those "bare metal" machines don't even have local NVMe.
About 10 years ago Intel Research was publishing a lot of great research but no software for the users.
Contrast it with Nvidia and their amazing software stack and support for their hardware.
When I read the Intel QAT / IAA / DSA whitepaper I knew it was the beginning of the end for Intel.
Every aspect of that document was just dripping in corporate dinosaur / MBA practices.
For example, they include 4 cores of these accelerators in most of their Xeons, but soft fuse them off unless you buy a license.
Nobody is going to buy that license. Okay, maybe one or two hyperscalers, but nobody else for certain.
It's ultra-important with a feature like this to make it available to everybody, so that software is written to utilise it. This includes the starving university student contributing to Postgres, not just some big-enterprise customer that merely buys their software!
They're doing the same stupid "gating" with AVX-512 as well, where it's physically included in desktop chips, but it is fused off so that server parts can be "differentiated".
Meanwhile AMD just makes one compute tile that has a uniform programming API across both desktop and server chips. This means that geeks tuning their software to run on their own PCs are inadvertently optimising them for AMD's server chips as well!
PS: Microsoft figured this out a while ago and fixed some of their products, like SQL Server. It now enables practically all features in all SKUs. Previously, when only Enterprise Edition had certain programmability features, nobody would use them, because software vendors couldn't ship software relying on features that customers with only Standard Edition couldn't run!
> Nvidia and their amazing software stack and support for their hardware.
Linus seems to disagree https://m.youtube.com/watch?v=tQIdxbWhHSM
Oh, yes. They spent too many years as the obvious #1, with a license to print money... when, longer-term, staying there required that Intel remain top-of-league in two profoundly difficult and fast-moving technologies (10e9+-transistor CPU design, and manufacturing such chips). Meanwhile, the natural rot of any large org - people getting promoted for their ladder-climbing skills, and making decisions for their own short-term benefit - was slowly eating away at Intel's ability to stay at the top of those leagues.
Intel needed to be split in two as well, which Gelsinger only half-heartedly did. He split the company into two functions - foundry and design, but didn't take that to its logical conclusion and split up the company completely.
> Intel needed to be split in two as well
Wouldn't that pretty much guarantee that the foundry business would fail, since a split-off Intel design company would have no incentive not to shift most of its manufacturing to TSMC? The same thing happened with AMD/Global Foundries.
Agree with OP that Intel was probably too deep into its downward spiral. While it seems Pat tried to make changes, including expanding into GPUs, it either wasn't enough or too much for the Intel board.
Splitting Intel is necessary but probably infeasible at this point in the game. The simple fact is that Intel Foundry Services has nothing to offer against the likes of TSMC and Samsung - perhaps only cheaper prices, and even then it's unproven at fabbing non-Intel chips. So the only way to keep it afloat is to keep fabbing Intel's own designs until the 18A node becomes viable/ready.
Agreed.
He should have cut 25% of the workforce to get started (and killed the dividend).
Also - the European expansion and Ohio projects, while good short-term strategies, were too early.
Cut the ship down to size and force existing sites to refocus or be closed. Get alarmist. Make sure you cut out all the bad apples. Prune the tree. Be ruthless and determined.
They should hire Reed Hastings now. He's the OG turnaround king.
Who/what is BK? Are they the previous person who held Pat Gelsinger's position?
> They really need to clear house in a spectacular way but I'm not sure who could even do that at this point.
An alien from Vega looking at our constellation of tech companies and their leadership might point at an obvious answer…