Nvidia buys $5B in Intel

2025-09-18 11:04 | www.tomshardware.com

Cats and Dogs, living together!

In a surprising announcement that finds two long-time rivals working together, Nvidia and Intel said today that they will jointly develop multiple new generations of x86 products. The products include Intel x86 CPUs tightly fused with an Nvidia RTX graphics chiplet for the consumer gaming PC market, named the ‘Intel x86 RTX SOCs.’ Nvidia will also have Intel build custom x86 data center CPUs for its AI products for hyperscale and enterprise customers. Additionally, Nvidia announced that it will buy $5 billion in Intel common stock at $23.28 per share. We spoke with Nvidia representatives to learn more details about the company’s plans.

Nvidia says that the partnership between the two companies is in the very early stages, so the timeline for product releases, along with any product specifications, will be disclosed at a later, unspecified date. (Given the traditionally long lead-times for new processors, it is rational to expect these products will take at least a year, and likely longer, to come to market.)

Nvidia emphasized that the companies are committed to multi-generation roadmaps for the co-developed products, which represents a strong investment in the x86 ecosystem. However, Nvidia tells us it also remains fully committed to its other announced product roadmaps and architectures, including the Arm-based GB10 Grace Blackwell processor for workstations, the Grace CPUs for data centers, and the next-gen Vera CPUs. Nvidia says it also remains committed to products on its internal roadmaps that haven’t been publicly disclosed yet, indicating that the new roadmap with Intel will merely be additive to its existing initiatives.

Nvidia hasn’t disclosed whether it will use Intel Foundry to produce any of the products yet. However, while Intel has used TSMC to manufacture some of its recent products, its goal is to bring production of most of its high-performance products back into its own foundries, and some of its products never left. For instance, Intel’s existing Granite Rapids data center processors use the ‘Intel 3’ node, and the upcoming Clearwater Forest Xeons will use Intel’s own 18A process node for compute. This suggests that at least some of the Nvidia-custom x86 silicon, particularly for the data center, could be fabbed on Intel nodes. However, Intel also uses TSMC to fabricate many of its client x86 processors now, so we won’t know for sure until official announcements are made — particularly for the RTX GPU chiplet.

While the two companies have engaged in heated competition in some market segments, Intel and Nvidia have partnered for decades, ensuring interoperability between their hardware and software for products spanning both the client and data center markets. However, these products have long used the PCIe interface to connect Intel CPUs and Nvidia’s GPUs. The new partnership will find tighter integration using the NVLink interface for CPU-to-GPU communication, which affords up to 14 times more bandwidth along with lower latency than PCIe, thus granting the new x86 products access to the highest performance possible when paired with GPUs. Let’s dive into the details we’ve learned so far.

Intel x86 RTX SOCs for the PC gaming market

For the PC market, the Intel x86 RTX SoC chips will come with an x86 CPU chiplet tightly connected with an Nvidia RTX GPU chiplet via the NVLink interface. This type of processor will have both the CPU and GPU units merged into one compact chip package that externally looks much like a standard CPU, rivaling AMD’s competing APU products.

This type of tight integration packs all the gaming prowess into one package without an external discrete GPU, providing power and footprint advantages. As such, these chips will be heavily focused on thin-and-light gaming laptops and small form-factor PCs, much like today’s APUs from AMD. However, it’s possible the new Nvidia/Intel chips could come in multiple flavors and permeate further into the Intel stack over time.

Intel has worked on a similar type of chip before with AMD; however, there is at least one significant technical difference between these initiatives. Intel launched its Kaby Lake-G chip in 2017 with an Intel processor fused into the same package with an AMD Radeon GPU chiplet, much the same as the description of the new Nvidia/Intel chips. You can see an image of the Intel/AMD chip below.

This SoC had a CPU at one end connected via PCIe to a separate AMD GPU chiplet, which was flanked by a small, dedicated memory package that only the GPU could use. The Nvidia/Intel products will instead have an RTX GPU chiplet connected to the CPU chiplet via the faster and more efficient NVLink interface, and we’re told they will support uniform memory access (UMA), meaning both the CPU and GPU will be able to access the same pool of memory.
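
To make the UMA distinction concrete, here is a rough sketch in today’s CUDA terms, contrasting a split memory pool (which needs explicit staging copies) with a single shared allocation that both processors can touch. This is purely our illustration; neither company has disclosed the actual programming model for these chips.

    /* Illustrative only: split pool (Kaby Lake-G style) vs. unified pool,
       expressed with today's CUDA runtime API. */
    #include <cuda_runtime_api.h>
    #include <string.h>

    #define N (1 << 20)
    static float host_buf[N];

    int main(void)
    {
        /* Split pools: the GPU's memory is its own island, so data must be
           staged across with an explicit copy. */
        float *gpu_only = NULL;
        cudaMalloc((void **)&gpu_only, N * sizeof(float));
        cudaMemcpy(gpu_only, host_buf, N * sizeof(float), cudaMemcpyHostToDevice);
        cudaFree(gpu_only);

        /* Unified pool (what UMA implies): one allocation that both the CPU
           and the GPU can dereference directly, with no staging copy. */
        float *shared = NULL;
        cudaMallocManaged((void **)&shared, N * sizeof(float), cudaMemAttachGlobal);
        memset(shared, 0, N * sizeof(float));  /* the CPU touches it... */
        /* ...and a GPU kernel could read the same pointer here. */
        cudaDeviceSynchronize();
        cudaFree(shared);
        return 0;
    }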

Intel notoriously axed the Kaby Lake-G products in 2019, and the existing systems were left without proper driver support for quite some time, in part because Intel was responsible for validating the drivers, and then finger-pointing ensued. We’re told that both Intel and Nvidia will be responsible for their respective drivers for the new models, with Nvidia naturally providing its own GPU drivers. However, Intel will build and sell the consumer processors.

We haven’t spoken with Intel yet, but the limited scope of this project means that Intel’s proprietary Xe graphics architecture will most assuredly live on as the primary integrated GPU (iGPU) for its mass-market products.

Nvidia's first x86 data center CPUs

Intel will fabricate custom x86 data center CPUs for Nvidia, which Nvidia will then sell as its own products to enterprise and data center customers. However, the nature and extent of the modifications are currently unknown. We do know that Nvidia will employ its NVLink interface, which tells us the chips could leverage Nvidia’s new NVLink Fusion tech, which lets custom CPUs and accelerators communicate with Nvidia’s GPUs faster and more efficiently than over PCIe.

NVLink Fusion (Image credit: Nvidia)

Intel has long offered custom Xeons to its customers, primarily hyperscalers, often with relatively minor tweaks to clock rates, cache capacities, and other specifications. In fact, these mostly slightly-modified custom Xeon models once comprised more than 50% of Intel’s Xeon shipments. Intel has endured several years of market share erosion due to AMD’s advances, most acutely in the hyperscale market. Therefore, it is unclear if the 50% number still holds true, as hyperscalers were the primary customers for custom models.

Intel has long said that it will design completely custom x86 chips for customers as part of its IDM 2.0 strategy. However, aside from a recent announcement of custom AWS chips that sound like the slightly modified Xeons mentioned above, we haven’t heard of any large-scale uptake for significantly modified custom x86 processors. Intel announced a new custom chip design unit just two weeks ago, so it will be interesting to learn the extent of the customization for Nvidia’s x86 data center CPUs.

Nvidia already uses Intel’s Xeons in several of its systems, like the Nvidia DGX B300, but these systems still use the PCIe interface to communicate with the CPU. Intel’s new collaboration with Nvidia will obviously open up new opportunities, given the tighter integration with NVLink and all the advantages it brings with it. The likelihood of AMD adopting NVLink Fusion is somewhere around zero, as the company is heavily invested in its own Infinity Fabric (XGMI) and Ultra Accelerator Link (UALink) initiatives, which aim to provide an open-standard interconnect to rival NVLink and democratize rack-scale interconnect technologies. Intel is also a member of UALink, which uses AMD’s Infinity Fabric protocol as the foundation.

Dollars and Cents, and Geopolitics

Nvidia’s $5 billion purchase of Intel common stock will come at $23.28 a share, roughly 6% below the current market value, but several aspects of this investment remain unclear. Nvidia hasn’t stated whether it will have a seat on the board (which is unlikely) or how it will vote on matters requiring shareholder approval. It is also unclear if Intel will issue new stock (primary issuance) for Nvidia to purchase, as it did when the U.S. government recently became an Intel shareholder (that is likely). Naturally, the investment is subject to approval from regulators.

Nvidia’s buy-in comes on the heels of the U.S. government buying roughly $9 billion of newly created Intel stock, granting the country a 9.9% ownership stake at $20.47 per share. The U.S. government won’t have a seat on the board and agreed to vote with Intel’s board on matters requiring shareholder approval “with limited exceptions.” SoftBank has also recently purchased $2 billion worth of primary-issuance Intel stock at $23 per share.

Purchases of Intel Stock

  Buyer            Total       Share Price
  Nvidia           $5 billion  $23.28
  U.S. Government  $9 billion  $20.47
  SoftBank         $2 billion  $23

The U.S. government says it invested in Intel with the goal of bolstering U.S. technology, manufacturing, and national security, and the investments from the private sector also help shore up the struggling company. Altogether, these investments represent a significant cash influx for Intel as it attempts to sustain the heavy capital expenditures required to compete with TSMC, all while struggling with negative free cash flow.

“AI is powering a new industrial revolution and reinventing every layer of the computing stack — from silicon to systems to software. At the heart of this reinvention is Nvidia’s CUDA architecture,” said Nvidia CEO Jensen Huang. “This historic collaboration tightly couples NVIDIA’s AI and accelerated computing stack with Intel’s CPUs and the vast x86 ecosystem—a fusion of two world-class platforms. Together, we will expand our ecosystems and lay the foundation for the next era of computing.”

“Intel’s x86 architecture has been foundational to modern computing for decades – and we are innovating across our portfolio to enable the workloads of the future,” said Intel CEO Lip-Bu Tan. “Intel’s leading data center and client computing platforms, combined with our process technology, manufacturing and advanced packaging capabilities, will complement Nvidia's AI and accelerated computing leadership to enable new breakthroughs for the industry. We appreciate the confidence Jensen and the Nvidia team have placed in us with their investment and look forward to the work ahead as we innovate for customers and grow our business.”

We’ll learn more details of the new partnership later today when Nvidia CEO Jensen Huang and Intel CEO Lip-Bu Tan hold a webcast press conference at 10 am PT.

This is breaking news…more to come.


Comments

  • By evanjrowley 2025-09-1816:2225 reply

    Nvidia's stake in Intel could have terrible consequences. First, it is in Nvidia's interest to kill Intel's Arc graphics, and that would be very bad because it is the only thing bringing GPU prices down for consumers. Second, the death of Intel graphics / Arc would be extremely bad for Linux, because Intel's approach to GPU drivers is the best for compatibility, whereas Nvidia is actively hostile to drivers on Linux. Third, Intel is the only company marketing consumer-grade graphics virtualization (SR-IOV), and the loss of that would make Nvidia's enterprise chips the only game in town, meaning the average consumer gets less performance, less flexibility, and less security on their computers.

    • By ho_schi 2025-09-1821:0413 reply

      Conclusion: Buy AMD. Excellent Linux support with in-tree drivers. For 15 years! A bug is something which will be fixed.

      Nvidia's GPUs are theoretically fast in initial benchmarks. But that's mostly optimization by others for Nvidia? That's it.

      Everything Nvidia has done is a pain. Closed-source drivers (old pain), out-of-tree drivers (new pain), ignoring (or actively harming) Wayland (everyone handles implicit sync well, except Nvidia which required explicit sync[1]), and awkward driver bugs declared as “it is not a bug, it is a feature”. The infamous bug:

          This extension provides a    way for applications to discover when video
          memory content has been lost, so that the application can re-populate
          the video memory content as necessary.
      
      https://registry.khronos.org/OpenGL/extensions/NV/NV_robustn...
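
      In practice, an application that opts into a robust context can at least detect the purge. A minimal sketch based on the public spec (the reset-status query is core GL 4.5/KHR_robustness, the NV token comes from this extension, and the re-upload helper is a made-up application function):

          /* Sketch only: call after resume or a VT switch, on a context created
             with robust access and this extension's purge-notification attribute.
             In real code, fetch glGetGraphicsResetStatus through your GL loader. */
          #include <GL/gl.h>
          #include <GL/glext.h>   /* defines GL_PURGED_CONTEXT_RESET_NV */

          extern void recreate_gpu_resources(void);  /* hypothetical app helper */

          void check_for_purged_vram(void)
          {
              GLenum status = glGetGraphicsResetStatus();
              if (status == GL_PURGED_CONTEXT_RESET_NV) {
                  /* Video memory contents were lost: re-upload textures,
                     buffer objects, FBO attachments, etc. */
                  recreate_gpu_resources();
              }
          }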

      This extension will soon be ten years old. At least they intend to fix it? They just didn't in the past 9 years! Basically, video memory could be gone after Suspend/Resume, VT-Switch and so on. The good news is, after years someone figured that out and implemented a workaround. For X11 with GNOME:

      https://www.phoronix.com/news/NVIDIA-Ubuntu-2025-SnR

      I hope in the meantime somebody implemented a patch for Wayland.

      What we need? Reliability. And Linux support. That’s why I purchase AMD. And previously Intel.

      [1] I don’t judge whether implicit sync or explicit are better.

      • By adrian_b 2025-09-199:19

        AMD is not competing enough with NVIDIA, so they are not a solution.

        What I mean is that whenever NVIDIA removed features from their "consumer" GPUs in order to reduce production costs and increase profits, AMD immediately followed them, instead of attempting to offer GPUs that have something that NVIDIA does not have.

        Intel at least tries to be a real competitor, e.g. by offering much, much better FP64 performance or by offering more memory.

        If Intel's discrete GPUs disappear, there will be no competition in consumer GPUs, as AMD tries to compete only in "datacenter" GPUs. I have ancient AMD GPUs that I cannot upgrade to newer AMD GPUs, because the newer GPUs are worse, not better (for computational applications; I do not care about games), while Intel offers acceptable substitutes, due to excellent performance per $.

        Moreover, NVIDIA also had excellent Linux driver support for more than 2 decades, not only for games, but also for professional graphics applications (i.e. much better OpenGL support than AMD) and for GPU computing applications (i.e. CUDA). AMD gets bonus points for open-source drivers and much more complete documentation, but the quality of their drivers has been typically significantly worse.

        NVIDIA always had good support even for FreeBSD, where I had to buy discrete NVIDIA GPU cards for computers with AMD APUs that were not supported for any other OS except Windows and Linux.

        AMD "consumer" GPUs are a great choice for those who are interested only in games, but not for those interested in any other GPU applications. AMD "datacenter" GPUs are good, but they are far too expensive to be worthwhile for small businesses or for individuals.

      • By clhodapp 2025-09-190:031 reply

        I've found the amdgpu Linux driver to be fairly buggy running dual monitors with my Radeon VII, and found things like the fTPM to be highly buggy on Threadripper 2k/x399 to the point that I had to add a dTPM. They never got things truly working properly with those more-niche products before they just.. kind of... stopped working on them. And of course ROCm is widely regarded to be a mess.

        On the other hand, my Steam Deck has been exceedingly stable.

        So I guess I would say: Buy AMD but understand that they don't have the resources to truly support all of their hardware on any platform, so they have to prioritize.

        • By mjevans 2025-09-193:37

          I seem to recall the Vega era as 'when I wouldn't buy a GPU because AMDs were just unstable' (and of course never closed source Nvidia).

          Took me almost 5 min to drill through enough Wikipedia pages to find the Radeon VII string.

          https://en.wikipedia.org/wiki/List_of_AMD_graphics_processin... https://en.wikipedia.org/wiki/Radeon_RX_Vega_series

          Contrast that with the earlier R9 285 that I used for nearly 10 years until I was finally able to get a 9070XT that I'm very happy with. They were still refining support for that aged GCN 1.2 driver even today, even if things are a lower priority to backport.

          Overall the ONLY things I'm unhappy about this GPU generation:

          * Too damned expensive
          * Not enough VRAM (and no ECC off of workstation cards?)
          * Too hard for average consumers to just buy direct and cut out the scalpers

          The only way I could get my hands on a card was to buy through a friend that lives within range of a Microcenter. The only true saints of computer hardware in the whole USA.

      • By lmm 2025-09-191:171 reply

        > What we need? Reliability. And Linux support

        Both of which NVidia does a lot better in practice! I'm all for open-source in-tree drivers, but in practice, 15 years on, AMD is still buggy on Linux, whereas NVidia works well (not just on Linux but on FreeBSD too).

        > I don’t judge whether implicit sync or explicit are better.

        Maybe you should.

        • By shmerl 2025-09-195:483 reply

          > Both of which NVidia does a lot better in practice!

          Correction - if they care. And they don't care to do it on Linux, so you get them dragging their feet for decades for something like Wayland support, PRIME, you name it.

          Basically, the result is that in practice they offer abysmally bad support, otherwise they'd have upstream kernel drivers and no userspace blobs. Linux users should never buy Nvidia.

          • By lmm 2025-09-197:531 reply

            > And they don't care to do it on Linux

            I don't understand what you're saying here. I've used NVidia on Linux and FreeBSD a lot. They work great.

            If your argument is they don't implement some particular feature that matters to you, fair enough. But that's not an argument that they don't offer stability or Linux support. They do.

            • By shmerl 2025-09-1917:551 reply

              Taking very long to implement stuff is a perfect argument of bad support for the platform. Timely support isn't any less important than support in general.

              • By jpc0 2025-09-206:111 reply

                Are you a product manager? Or do you just not see the irony in your comment?

                Long term support means my thing that has been working great continues to work great. New feature implementation has nothing to do with that and is arguably directly against long term support.

                And Nvidia seems justified in this since effectively no distro dropped X11 until Nvidia had support.

                • By shmerl 2025-09-211:381 reply

                  If you think taking decades is an acceptable rate while others do it in a timely manner it's your own problem. For any normal user it's completely unacceptable and is the opposite of great (add to it, that even after decades of dragging their feet they only offer half cooked support and still can't even sort out upstreaming their mess). Garbage support is what it is.

                  • By jpc0 2025-09-217:341 reply

                    AMD is notorious for not having ROCm support on currently sold, in-production GPUs, and for horrendous bugs that actually make the devices unusable.

                    I use AMD GPUs on Linux, and I generally regret not just buying an Nvidia GPU, purely because of AMD's lacklustre support for compute use cases in general.

                    Intel is still too new in the dGPU market to trust and on top of that there is so much uncertainty about whether that entire product line will disappear.

                    So at this point the CUDA moat makes it a non-issue; on top of that, what works works and keeps working, whereas with AMD I constantly wonder whether something will randomly not work after an update.

                    A timeline of decades for “features” your biggest consumers don’t care about is a reasonable tradeoff, even more so if actually pushing those features would reduce stability.

                    • By shmerl 2025-09-218:32

                      That's exactly the point. Nvidia might care about industrial use cases, while they don't care about desktop Linux usage, and their support is garbage as a result.

          • By jgb1984 2025-09-197:29

            I've been using Nvidia gpus exclusively on debian linux for the past 20 years, using the binary Nvidia drivers. Rock solid stability and excellent performance. I don't care for Wayland as I plan to stay on Xorg + Openbox for as long as I can.

          • By bigyabai 2025-09-1915:091 reply

            Wayland support hasn't been an issue since GLX was deprecated for EGLStream. I think the Nvidia backend has been "functional" for ~3 years and nearly flawless for the past year or so.

            Both Mutter and KWin have really good Nvidia Wayland sessions nowadays.

            • By shmerl 2025-09-1917:51

              It got better, but my point is how long it took to get better. That's the indicator of how much they care about Linux use cases in general. Which is way below acceptable level - it's simply not their priority (which is also exacerbated by their hostile approach to upstreaming).

              I.e. if anything new will need something implemented tomorrow, Nvidia will make their users wait another decade again. Which I consider an unacceptable level of support and something that flies in the face of those who claim that Nvidia supports Linux well.

      • By guerrilla 2025-09-198:49

        Buying AMD (for graphics) has been the only ethical choice for a long time. We must support the underdogs. Since regulation has flown the coop, we must take responsibility ourselves to fight monopolies. The short term costs may be a bit higher but the long term payoff is the only option for our self-interest!

        / steps down from soap box /

      • By mort96 2025-09-1910:201 reply

        > Conclusion: Buy AMD. Excellent Linux support with in-tree drivers.

        Funnily, AMD's in-tree drivers are kind of a pain in the ass. For up to a year after a new GPU is released, you have to deal with using Mesa and kernel packages from outside your distro, while if you buy a brand new Nvidia card, you just install the latest release of the proprietary drivers and it'll work.

        Linux's driver model really is not kind to new hardware releases.

        Of course, I still buy AMD because Nvidia's drivers really aren't very good. But that first half a year was not pleasant last time I got a relatively recently released (as in, released half a year earlier) AMD card.

        • By account42 2025-09-1910:341 reply

          Use a better distro that includes drivers for new hardware.

          • By mort96 2025-09-1910:46

            A lot of people want to use Ubuntu or Ubuntu-based distros.

            I have since switched from Ubuntu to Fedora, maybe Fedora ships mesa and kernel updates within a week or two from release, I don't know. But being unable to use the preferred distro is a serious downside for many people.

      • By est31 2025-09-1822:391 reply

        > Excellent Linux support with in-tree drivers. For 15 years!

        Linux support has been excellent on AMD for less than 15 years though. It got really good around 10 years ago, not before.

        • By kimixa 2025-09-190:541 reply

          ATI/AMD open source linux support has been blowing hot and cold for over 25 years now.

          They were one of the first to actually support open source drivers, with the r128 and original radeon (r100) drivers. Then went radio silence for the next few years, though the community used that as a baseline to support the next few generations (r100 to r500).

          Then they reemerged with actually providing documentation for their Radeon HD series (r600 and r700), and some development resources but limited - and often at odds with the community-run equivalents at the time (lots of parallel development with things like the "radeonhd" driver and disagreements on how much they should rely on their "atombios" card firmware).

          That "moderate" level of involvement continued for years, releasing documentation and some initial code for the GCN cards, but it felt like beyond the initial code drops most of the continuing work was more community-run.

          Then only relatively recently (the last ~10 years) have they started putting actual engineering effort into things again, with AMDGPU and the majority of mesa changes now being paid for by AMD (or Valve, which is "AMD by proxy" really as you can guarantee every $ they spend on an engineer is $ less they pay to AMD).

          So hopefully that's a trend you can actually rely on now, but I've been watching too long to think that can't change on a dime.

          • By ahartmetz 2025-09-191:47

            It is possible that at some point, maybe 15 years ago, AMD provided sufficient documentation to write drivers, but even 10 years ago, a lot of documentation was missing (without even mentioning that fact), which made trying to contribute rather frustrating. Not too bad, because as you said, they had a (smallish) number of employees working on the open drivers by then.

      • By hdjfjzhej 2025-09-191:351 reply

        Agreed! This is great news for AMD and users.

        Those who want to run Linux seriously will buy AMD. Intel will be slowly phased out, and this will reduce maintenance and increase the quality of anything that previously had to support both Intel and AMD.

        However, if Microsoft or Apple scoop up AMD, all hell will break loose. I don’t think either would have interest in Linux support.

        • By account42 2025-09-1910:44

          > Agreed! This is great news for AMD and users.

          Less competition is NOT good news for AMD users. Their CPUs are already a lot less competitively priced now that they beat Intel in market share.

      • By trklausss 2025-09-199:06

        Oh boy that strikes a nerve with the "Video memory could be gone after Suspend/Resume". Countless hours lost trying to fix a combination of drivers and systemd hooks for my laptop to be able to suspend/hibernate and wake up back again without issues... Which makes it even more complicated when using Wayland.

        I have been looking at high-end laptops with dedicated AMD Graphics chip, but can't find many... So I will probably go with AMD+NVidia with MUX switch, let's see how it goes... Unless someone else has other suggestions?

      • By codedokode 2025-09-1822:19

        > Basically, video memory could be gone after Suspend/Resume, VT-Switch and so on.

        This actually makes sense: for example, a new task has swapped out the previous task's data, or a host and guest are sharing the GPU and pushing each other's data away. I don't understand why this is not part of GPU-related standards.

        As for a solution, wouldn't discarding all the GPU data after resume help? Or keeping the data in system RAM?

      • By jacobgorm 2025-09-195:54

        The last time I tried to file a bug for a crash in an AMD Windows driver, I had to go through an anonymous developer I found on Discord, and despite weeks of effort writing and sharing a test case, they chose to ignore the bug report in the end. The developer even asked not to be named as he might face repercussions for trying to help out.

      • By ekianjo 2025-09-1823:20

        Excellent Linux support. Except for ROCm which is a big mess.

      • By bobajeff 2025-09-1822:441 reply

        I once had a mini PC with Nvidia. I got it for CUDA dev. One day support for it was dropped, so I was unable to update my system without it messing things up. So regardless of CUDA, I decided Nvidia is not for me.

        However, doing research when buying a new PC, I've found that AMD kind of sucks too. ROCm isn't even supported on many of the systems I was looking into. Also, I've heard their Linux graphics drivers are poor.

        So basically I just rock a potato with Intel integrated graphics now. GPUs cost too much to deal with that nonsense.

        • By ahartmetz 2025-09-191:491 reply

          I would really disagree that AMD's Linux graphics drivers are poor. Only ROCm is.

          • By bobajeff 2025-09-193:20

            In your case maybe, but not according to some of the comments here in this very thread and also in some forums and YouTube videos back when I'd last checked.

      • By bigyabai 2025-09-1823:09

        FWIW, my experience gaming/web browsing/coding on a 3070 with modern drivers has been fine. Mutter and KWin both have very good Wayland sessions if you're running the new (>550-series) drivers.

    • By bee_rider 2025-09-1817:558 reply

      Apparently it is 5% ownership. Does that give them enough leverage to tank Intel’s iGPUs?

      That would seem weird to me. Intel's iGPUs are an incredibly good solution for their (non-glamorous) niche.

      Intel’s dGPUs might be in a risky spot, though. (So… what’s new?)

      Messing up Intel’s iGPUs would be a huge practical loss for, like, everyday desktop Linux folks. Tossing out their dGPUs, I don’t know if it is such a huge loss.

      • By sodality2 2025-09-1818:242 reply

        > Tossing out their dGPUs, I don’t know if it is such a huge loss

        It would be an enormous loss to the consumer/enthusiast GPU buyer, as a third major competitor is improving the market from what feels like years and years of dreadful price/perf ratio.

        • By behringer 2025-09-1823:05

          They were one release away from completely upending the market. A sad day this is.

        • By cluckindan 2025-09-1818:311 reply

          You don’t say… on the very same day AMD launched a new RDNA3 card (RX 7700).

          Literally a previous gen card.

          • By sim7c00 2025-09-1819:36

            amd is slow and steady. they were behind many times and many times they surprised with amazing innovations overtaking intel. they will do it again, for both CPU and GPU.

      • By tart-lemonade 2025-09-1819:134 reply

        Intel's iGPUs don't seem very at risk because the market for low-power GPUs isn't very profitable to begin with. As long as Nvidia is able to sell basically any chip they want, why waste engineering hours and fab time on low-margin chips? The GT 1030 (Pascal) never got a successor, so that line is as good as dead.

        Even before the Pascal GTs, most of the GT 7xx cards, which you would assume were Maxwell or Kepler from the numbering, were rebadged Fermi cards (4xx and 5xx)! That generation was just a dumping ground for all the old chips they had laying about, and given the prominence of halfway decent iGPUs by that point, I can't say I blame them for investing so little in the lineup.

        That said, the dGPUs are definitely somewhat at risk, but I think the risk is only slightly elevated by this investment, given that it isn't exactly a cash cow and Intel has been doing all sorts of cost-cutting lately.

        • By hakfoo 2025-09-195:45

          Aren't a lot of those cards sold for the audience that needs more display heads rather than necessarily performance?

          This has been somewhat improved-- some mainboards will have HDMI and DisplayPort plumbed to the iGPU, but the classic "trader desk" with 4-6 screens hardly needs a 5090.

          They could theoretically sell the same 7xx and 1030 chips indefinitely. I figure it's a static market like those strange 8/16Mb VGA chipsets that you sometimes see on server mainboards, just enough hardware to run diagnostics on a normally headless box.

        • By xp84 2025-09-1819:242 reply

          Agree. Not only would there be no money in it to try to replace Iris graphics or whatever they call them now -- it would be ultra pointless because the only people buying integrated graphics are those where gaming, on-device AI, and cryptocurrency aren't even part of the equation. Now, that is like 80%+ of the PC market, but it's perfectly well served already.

          I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.

          • By behringer 2025-09-1823:031 reply

            Intel sells discrete cards, and their next card was set up to do AI and games competently. They were poised to compete with the low to mid range Nvidia cards at HALF the cost.

            It was definitely going to upset the market. Now I understand the radio silence on a card that was supposed to have been coming by Xmas.

            • By xp84 2025-09-1918:571 reply

              Oh for sure. Arc is in jeopardy. Though tbh it was already, wasn't it? Can't you see an alternate universe where this story never happened, but Intel announced today "Sorry, because our business is dying in general and since Arc hasn't made us a ton of money yet anyway, we need to cut Arc to focus on our core blah blah blah".

              I just meant their integrated GPUs are what's completely safe here.

              • By behringer 2025-09-1919:051 reply

                I doubt it's safe, it competes directly with Nvidia on handhelds.

                Also the arc wasn't in jeopardy, the arc cards have been improving with every release and the latest one got pretty rave reviews.

                • By xp84 2025-09-1921:52

                  It wasn't in jeopardy for being no good, it was in jeopardy because Intel is so troubled. Like the Bombardier C-Series jet: Everyone agreed it was a great design and very promising, but in the end they had no choice but to sell it to Airbus (who calls it the A220), I think because they didn't really have the money to scale up production. In like manner, Intel lacks the resources to make Arc the success it technically deserves to be, and without enough scale, they'll lose money on Arc, which Intel can hardly afford at this point.

          • By kmacdough 2025-09-1819:374 reply

            Calling BS on "gaming not part of the equation". Several of my friends and I game exclusively on integrated graphics. Sure, we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.

            • By utternerd 2025-09-1820:522 reply

              RDR2 is quite optimized. We spend a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is absolutely highly optimized as exhibited by the large amount of benchmarks on the web.

              • By purpleflame1257 2025-09-1821:02

                This is why I love HN. You get devs from any software or hardware project you care to name showing up in the comments.

              • By uncircle 2025-09-196:29

                RDR2 ran beautifully on Linux for me. If you were part of the team, excellent work.

            • By xp84 2025-09-1819:581 reply

              Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.

              (And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)

              • By og_kalu 2025-09-1821:321 reply

                RDR2, Ghost of Tsushima, Black Myth: Wukong. These games will play at 40 to 50+ fps at 1080p low to medium on the Intel Arc iGPUs (no AI upscaling).

                To anyone actually paying attention, igpus have come a long way. They are no longer an 'I can play minecraft' thing.

                • By xp84 2025-09-1919:06

                  That performance is not surprising, Arc seems pretty dope in general.

                  I hadn't realized that "Arc" and "Integrated" overlapped, I thought that brand and that level of power was only being used on discrete cards.

                  I do think that integrated Arc will probably be killed by this deal though, not for being bad as it's obviously great, rather for being a way for Intel to cut costs with no downsides for Intel. If they can make RTX iGPUs now, and the Nvidia and RTX brand being the strongest in the gaming space... Intel isn't going to invest the money in continuing to develop Arc, even if Nvidia made it clear that they don't care, it just doesn't make any business sense now.

                  That is a loss for the cause of gaming competition. Although having Nvidia prop up Intel may prove to be a win for competition in terms of silicon in general versus them being sold off in parts, which could be a real possibility it seems.

            • By fluoridation 2025-09-1820:101 reply

              "Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.

            • By KronisLV 2025-09-1820:22

              > Sure we don't play the most abusively unoptimized AAA games like RDR2.

              Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). Or with the Borderlands 4 controversy, I thought there'd be way bigger fish to fry.

        • By jandrese 2025-09-1819:28

          It would be amusing to see nVidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.

        • By TiredOfLife 2025-09-194:05

          Out of 9 desktop GT 7xx cards only 2 were Fermi rest were Kepler.

          Out of 12 mobile GT 7xx cards only 3 were Fermi (and 2 of those were M and not GT) rest were Kepler.

      • By freedomben 2025-09-1818:07

        I would guess Nvidia doesn't care at all about the iGPUs, so I agree they are probably not at risk. The dGPUs, though, I absolutely agree are in a risky spot. Perhaps Intel was planning to kill its more ambitious GPU goals anyway, but that seems extremely unhealthy for pretty much everyone except Nvidia.

      • By dijit 2025-09-1819:02

        5% of Ubisoft was all it took for Tencent to have very deep reaching ramifications.

        They were felt at an IC level.

      • By monocasa 2025-09-1821:25

        We'd have to see their cap table approximation, but I've seen functional control over a company with just a hair over 10% ownership given the voting patterns of the other stock holders.

        5% by about any accounting makes you a very, very influential stockholder in a publicly traded company with a widely distributed set of owners.

      • By misiek08 2025-09-1822:57

        Intel was already dead; even money from the government didn't help them. It is an old, legacy, bad corp. I think NV just wants to help them and use them however it wants - Intel management will do anything they say.

      • By beached_whale 2025-09-1823:061 reply

        Intel's GPUs are a better solution for almost all computing outside of high-end gaming, AI, and a few other tasks. For most things a better GPU is overkill and wastes energy.

        • By JustExAWS 2025-09-1823:481 reply

          So they are better except in all the ways that people care about…

          • By beached_whale 2025-09-2220:37

            Most computing tasks are not those. They may care, but boring is the norm.

      • By giveita 2025-09-1820:451 reply

        Would be antitrust right?

        • By monocasa 2025-09-1821:27

          Which would take an administration that cared about enforcing anti trust for the stated reasons behind anti trust laws.

    • By arkmm 2025-09-1821:484 reply

      This misses the forest for the trees IMO:

      - The datacenter GPU market is 10x larger than the consumer GPU market for Nvidia (and it's still growing). Winning an extra few percentage points in consumer is not a priority anymore.

      - Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.

      - Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshoring, which significantly improves Nvidia's supplier negotiation position and hedges geopolitical risk.

      • By tw04 2025-09-190:163 reply

        > Nvidia doesn't have a CPU offering for the datacenter market and they were blocked from acquiring ARM. It's in their interest to have a friend on the CPU side.

        Someone should tell nvidia that. They sure seem to think they have a datacenter CPU.

        https://www.nvidia.com/en-us/data-center/grace-cpu-superchip...

        • By kimixa 2025-09-191:00

          I wonder if this signals a lack of confidence in their CPU offerings going forward?

          But there's always TSMC being a pretty hard bottleneck - maybe they just can't get enough (and can't charge close to their GPU offerings per wafer), and pairing with Intel themselves is preferable to just using Intel's Foundry services?

        • By gpm 2025-09-191:34

          > Someone should tell nvidia that

          To be fair from what I hear someone really should tell at least half of nvidia that.

        • By high_na_euv 2025-09-1912:36

          Jensen was literally talking about the need for an x86 CPU on yesterday's webcast

      • By trhway 2025-09-190:441 reply

        >Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC.

        The East India Company conducted continental wars on its own. A modern company with a $4T valuation and country-GDP-sized revenue, possessing the key military technology of today's and tomorrow's wars - AI software and hardware, including robotics - can successfully wage such a continental war through a suitable proxy, say an oversized private military contractor (especially if it is massively armed with drones and robots), and in particular is capable of defending an island like Taiwan. (Or thinking backwards - an attack on Taiwan would cause a trillion or two drop in NVDA's valuation. What options get on the table when there is a threat of a trillion-dollar loss... To compare - 20 years of Iraq cost $3 trillion, i.e. $150B/year buys you a lot of military hardware and action, and an efficient defense of Taiwan would cost much less than that.)

        • By purpleflame1257 2025-09-1912:242 reply

          Defending against territorial conquest is considerably easier than defending against kinetic strikes on key manufacturing facilities

          • By trhway 2025-09-1922:17

            Not necessarily. Territorial war requires people. Defense from kinetic strikes on key objects concentrated on smallish territory requires mostly high-tech - radars and missiles - and that would be much easier for a very rich high-tech US corporation.

            An example - the Starlink antenna, sub-$500, a phased array which actually is like a half or a third of such an array on a modern fighter jet, where it costs several million. Musk naturally couldn't go the way of a million-per-antenna, so he had to develop and source it on his own. The same with anti-missile defense - if/when NVDA gets to it to defend the TSMC fabs, NVDA would produce such defense systems orders of magnitude cheaper, and that defense would work much better than modern military systems.

          • By AtlasBarfed 2025-09-1914:53

            If China bombs tsmc, we blockade the Malacca straits.

            China's economy shuts down in a month, their population starves in another month

      • By throwaway2037 2025-09-190:043 reply

            > Nvidia is fabless and has concentrated supplier and geopolitical risk with TSMC. Intel is one of the only other leading fabs onshor[e]
        
        TSMC is building state of the art fabs in Arizona, USA. Samsung in Texas, USA. I assume these are being done to reduce geopolitical risk on all sides.

        Something that I never read about: Why can't NVidia use Samsung fabs? They are very close to TSMC state of the art.

        • By re-thc 2025-09-192:50

          > They are very close to TSMC state of the art.

          They're not. Most have tried at 1 point. Apple had a release with TSMC + Samsung and users spotted a difference. There was quite a bit of negativity.

        • By high_na_euv 2025-09-1912:391 reply

          TSMC will not have state of the art on US soil.

          Taiwanese gov prevents them from doing it. Leading node has to be on Taiwanese soil

        • By mrheosuper 2025-09-192:30

          Maybe after being bitten by Samsung on their RTX3000 GPU. Power Spike and a lot of heat.

      • By behringer 2025-09-1823:07

        Intel just released a halfway decent workstation (e.g. data center) card, and we were expecting an even better set of cards by Xmas before this happened.

    • By lacy_tinpot 2025-09-1818:114 reply

      Not necessarily true. This might be a Microsoft funding a bankrupt Apple kind of moment.

      American competition isn't a zero-sum game, and it's in Nvidia's best interest to keep the market healthy.

      • By rapind 2025-09-1818:192 reply

        > American competition isn't a zero sum, and it's in Nvidias' best interest to keep the market healthy.

        Looking at Google's recent antitrust settlement, I'm not sure this is true at present.

        • By tonyhart7 2025-09-1818:42

          Google literally "won" the antitrust case???

          The fact that Google pays Firefox annually means that it's in Google's best interest that there is no monopoly, the judge says.

        • By lacy_tinpot 2025-09-1819:243 reply

          Nvidia's options are: fund your competition to keep the market dynamic, or let the government do it by breaking you apart.

          So yes. That's how American competition works.

          It isn't a zero sum game. We try to create a market environment that is competitive and dynamic.

          Monopolies are a threat to both the company and a free, open, dynamic market. If Nvidia feels it could face an antitrust suit, which is reasonable, it is in its best interest to fund the future of Intel.

          That's American capitalism.

          • By NooneAtAll3 2025-09-1819:282 reply

            > or let the government do it by breaking you a part.

            Looking at Google's recent antitrust settlement, I'm not sure this is true at present.

            • By prasadjoglekar 2025-09-1820:07

              There are at least 2 more anti trust suits against Google on going. One is about to enter the remedies phase in Virginia.

            • By lacy_tinpot 2025-09-1819:44

              Because the recent settlement determined, in my opinion correctly, that the market is still dynamic and competitive.

              Google search is genuinely being threatened.

              Google is not a monopoly, not entirely.

              If AI usage also starts accruing to Google then there should be a new antitrust suit.

          • By mcintyre1994 2025-09-1819:463 reply

            I can’t imagine Nvidia has any concerns about that with the current administration.

            • By account42 2025-09-1911:091 reply

              We have at least seen anti-trust suits proceed against Google under the current (US) administration. The same cannot be said for the previous one.

              • By mcintyre1994 2025-09-1912:27

                Are you referring to the case that started in 2023 under the previous administration?

            • By pedroma 2025-09-1821:39

              Will Nvidia continue to exist beyond the current administration? If yes, then would it be prudent to consider the future beyond the current administration?

            • By at-fates-hands 2025-09-1821:51

              But it did when Biden was in office?

          • By siva7 2025-09-198:15

            Which government? This one?

      • By rasz 2025-09-1818:561 reply

        Microsoft wasn't funding a bankrupt Apple; Microsoft was settling a lawsuit with Jobs just on the cusp of the DOJ monopoly lawsuit. Microsoft was stealing and shipping Apple QuickTime source code.

        https://www.theregister.com/1998/10/29/microsoft_paid_apple_...

        > handwritten note by Fred Anderson, Apple's CFO, in which Anderson wrote that "the [QuickTime] patent dispute was resolved with cross-licence and significant payment to Apple." The payment was $150 million

        • By humanfromearth9 2025-09-1819:48

          Wow quicktime... That's a name I haven't heard for a long time.

      • By villgax 2025-09-1818:251 reply

        You might want to re-read about that Apple-Microsoft incident.

        Quicktime got stolen by an ex-Apple employee & in return Apple had Microsoft commit money & promise to have Office suite available on macOS/OS X

      • By aDyslecticCrow 2025-09-1818:54

        One interesting parallel is Intel and AMD back in 1991 over x86, which is today the reason AMD is allowed to produce x86 at all without massive patent royalties to Intel. [Asianometry](https://youtu.be/5oOk_KXbw6c) had a nice summary of it.

        Nvidia is leaning more into data centres, but lacks a CPU architecture or expertise. Intel is struggling financially, but has knowledge in iGPUs and a vast number of patents.

        They could have a lot to give one another, and it's a massive win if it keeps Intel afloat.

    • By fnord123 2025-09-1819:231 reply

      > Nvidia is actively hostile to drivers on Linux

      Nvidia is contributing to Nova, the new Nvidia driver for GSP based hardware.

      https://rust-for-linux.com/nova-gpu-driver

      Alexandre Courbot, an Nvidia dev, is comaintainer.

      https://www.phoronix.com/news/NOVA-Core-Co-Maintainer

      • By OccamsMirror 2025-09-191:101 reply

        Yeah, I think Nvidia were hostile to Linux when they saw no value in it. Now it's where the machine learning is. It's the OS powering the whole AI hype train. Then there is also the Steam Deck making Linux gaming not a complete write-off anymore.

        The days of Nvidia ignoring Linux are over.

        • By spoaceman7777 2025-09-191:25

          For real. Nvidia is even selling desktop Linux computers now, with the launch of the DGX Spark.

          The "F you Nvidia" Linus Torvalds moment in 2012 is a meme that will not die.

    • By agentcoops 2025-09-1820:352 reply

      The article hints at it, but my guess would be that this investment is intended to build up Intel Foundry and get it to a place where NVIDIA can eventually rely on it over TSMC, with the ownership stake largely there to give NVIDIA upside if/when Intel stock goes up on news of an NVIDIA contract, etc. It isn't that uncommon an arrangement for enterprise deals of such a potential magnitude. Long term, however, and even without NVIDIA making the call, that could definitely have the effect of leading Intel to divest from directly competing in as many markets, i.e. Arc.

      For context, I highly recommend the old Stratechery articles on the history of Intel foundry.

      • By janc_ 2025-09-1821:55

        My first thought was also that this relates to Intel's foundry business. Even if only to be able to use it in price negotiations with TSMC (it's hard to threaten to go elsewhere when there is no elsewhere left).

      • By ldng 2025-09-199:39

        Which is relatively funny since the new CEO has made it clear he wants to divest on the foundry side ...

    • By ksec 2025-09-216:41

      Do we want Intel to fail and go bankrupt? Or do we want Intel to survive? I don't think most people are clear on what is happening here. This is it. The Margin Call moment.

      Intel could either transform into a fabless company and compete on design, manufacturing with TSMC, or continue to be a foundry player, crucial to US strategic interests. You can only pick one, and competing on just one of them is already a monumental task.

      GPU is burning money, with no short-term success in sight that could make them cash-flow positive in a 3-4 year time frame. I have been stating this since 2016 and we are now coming close to 2026; recent market-share numbers suggest Intel is at less than 1% of the discrete market. And that's especially true given the strong roadmap Nvidia has.

      This gives Intel a perfect excuse to quit GPUs. Nvidia provides the cash flow to, hopefully, continue developing 18A and 14A; Intel manufactures Nvidia GPUs for them and slowly transitions to an x86 + foundry model, or even solely manufactures for Nvidia. The US administration could further push Apple, Qualcomm, and Broadcom to use Intel in some capacity. All of this assumes Intel can keep up with TSMC, which is probably a comparatively easier task than tackling the GPU market.

      I am assuming the Intel board is happy with that direction, though. Because so far they have shown they completely lack any strategic vision.

    • By random3 2025-09-1818:162 reply

      If only antitrust laws would exist

      • By bobby_mcbrown 2025-09-1819:20

        Or if monopoly laws like copyright didn't

      • By dsr_ 2025-09-1818:57

        and be enforced

    • By upboundspiral 2025-09-1821:081 reply

      This seems like it could be a long term existential threat for AMD. AMD CPU + GPU combos are finally coming out strong, both with MI300+ series in the supercomputing space, Strix Halo in laptops, etc. They have the advantage of being able to run code already optimized for x86 (important for gamers and HPC code), which NVIDIA doesn't have. Imagine if Grace Blackwell had x86 chips instead of Arm. If NVIDIA can get Intel CPUs with its chip offerings, it could be poised to completely take over so many new portions of the market/consolidate its current position by using its already existing mindshare and market dominance.

      • By hakfoo 2025-09-196:091 reply

        x86 is important for gamers, but is it for HPC? That tends to be far less dependent on binary blobs.

        • By high_na_euv 2025-09-1912:46

          Jensen was talking about it during webcast, so apparently yes

    • By jm4 2025-09-1819:45

      This seems more like the deal where Microsoft invested in Apple. It’s basically charity and they will flip it in a few years when Intel gets back on their feet.

    • By freedomben 2025-09-1818:04

      You absolutely nailed it IMHO. I wish I had more upvotes to give. I guess time will tell, but this seems like a clear conflict of interest.

    • By dchftcs 2025-09-2013:23

      Yeah, Nvidia has trillions at stake, Intel a mere 100B. It's more in the interests of Nvidia to interfere with Intel's GPU business than to help it, and the only things they want from Intel are the fabs.

    • By mihaaly 2025-09-1914:26

      Now the whole purchase makes sense! : /

      Using a fortune that falls into your lap to kill competition is a common practice of economics-oriented (vs. technology-oriented) organizations. It brings benefits only for the organization; for everyone else it brings damage and disappointment.

    • By classicmotto 2025-09-192:28

      At this point Nvidia is just shooting themselves in the foot with hostility towards Linux - they are actively using Linux for their DGX systems, and the dependency on Linux is only going to grow internally.

    • By citizenpaul 2025-09-1820:10

      Something about this reminds me of other industry gobbling purchases. None of them ever turned out better for the product, price or general well being of society.

    • By mensetmanusman 2025-09-1818:151 reply

      Microsoft’s investment in Apple was helpful for the world.

      • By xp84 2025-09-1819:292 reply

        As an Apple user (and even an Apple investor), I'd rather that Apple went out of business back then. If we could re-roll the invention of the (mainstream) smartphone, maybe we'd get something other than two monopolistic companies controlling everything.

        For instance, maybe if there were 4 strong vendors making the devices with diverse operating systems, native apps wouldn't have ever become important, and the Web platform would have gotten better sooner to fill that gap.

        Or maybe it'd have ended up the same or worse. But I just don't think Apple being this dominant has been good for the world.

        • By tracker1 2025-09-1820:085 reply

          Or... we could still be using blackberry-like devices without much in the way of active/touch interface development at all. Or worse, the Windows CE or Palm with the pen things.

          • By magarnicle 2025-09-1822:461 reply

            Nah, the LG Prada phone would have taken over the world.

          • By rhetocj23 2025-09-1821:321 reply

            Lol exactly. That poster should quickly realise he's got it pretty good given the alternatives.

            • By xp84 2025-09-1918:471 reply

              Why? Was Steve Jobs literally the only human who was capable of seeing the massive unserved demand that existed back then?

              Sidekick was amazing for its time, but only on one also-ran carrier. BlackBerry had great features like BBM (essentially iMessage) but underpowered for multimedia and more difficult to learn. If Apple was out of business, one or more companies would have made the billions on MP3 players that iPod made, and any of them could have branched into phones and made a splash the same way. Perhaps Sony, perhaps Microsoft. Microsoft eventually figured it out -- the only reason they failed was that they waited for both Apple and Android to become entrenched so in this timeline they could have been the second-mover, but unlike with Apple and Android, maybe neither MS nor Google would have automatically owned the US marketshare the way Apple does[1]. If that were the case, we may have competition, instead of the unhealthy thing we have where Apple just does whatever they want.

              [1] https://gs.statcounter.com/vendor-market-share/mobile/united...

              • By rhetocj23 2025-09-1923:31

                With all due respect, there's a simple answer to why Apple was destined to win the smartphone race - they had a 5-year lead over everyone else because they had the OS and touch interface tightly integrated. On top of that they managed to scale up the production of the glass necessary for the touch to work, and partnered with a network provider to overcome the control network providers had over handset producers.

                They had such a lead that nobody was going to catch up and eat into their economic profits. Sure, Samsung et al have captured market share, but they haven't eaten into Apple's economic profits.

                Whether you like it or not, this hard work, effort and creativity deserves to be rewarded - in the form of monopoly/oligopoly profits.

                Apple has shown itself to be very disciplined with its cash. That cannot be said for Google, which, instead of funding an endless stream of vanity projects, should return that cash to shareholders.

          • By Lorin 2025-09-195:36

            I still miss my KeyOne keyboard.

          • By ahartmetz 2025-09-192:03

            BB10 was the shit. Fantastic OS and (some models) a great hardware keyboard. But it was already a response to the iPhone, wouldn't have happened without...

          • By mrheosuper 2025-09-192:35

            with focus on privacy and security? Sign me up.

        • By mensetmanusman 2025-09-193:011 reply

          Nope, we know exactly where it was headed. Phones controlled by carriers full of NFL ads.

          • By xp84 2025-09-1918:55

            There's nothing supernatural about Apple that meant only they could do something better than that shitty generation of devices. Remember, the portable consumer electronics market would certainly have other huge players if Apple hadn't existed to make the iPod. BlackBerry, Microsoft, and Sony come to mind. iPhone, based mainly on Apple's popularity from the iPod era, got a huge jump from that, and then the rush for native apps, which encourages consolidation, smothered every other company's competing devices (such as WebOS, BlackBerry 10, Windows Mobile) before they had a chance to compete.

            To be honest, Android may have met a similar fate if Apple had been able to negotiate a non-exclusive contract with Cingular/AT&T. My understanding though was that they had to give exclusivity as a bargaining chip to get all the then-unthinkable concessions, as yeah, every phone was full of garbage bloatware and festooned with logos.

    • By bitexploder 2025-09-192:09

      Markets seem to be at least reasonably competitive with 3 vendors. 2 vendors in a space often leads to less desirable outcomes for consumers.

    • By TiredOfLife 2025-09-194:08

      > wheras Nvidia is actively hostile to drivers on Linux

      Is that why, to use CUDA on Windows, you have to use WSL2 (virtualized Linux)?

    • By 6r17 2025-09-1821:03

      Reading this after that memo about China's attitude to Nvidia is actually chilling - they just don't care, do they?

    • By sim7c00 2025-09-1819:35

      why would they wanna kill intel? amd has better cpus and also does gpus better than intel? (*yes i may be missing things, pls do tell!)

    • By bentt 2025-09-193:59

      I don't agree here. Nvidia could simply segment the market and keep Intel on the low end of GPUs. I think AMD is the real target/victim here.

    • By itsthecourier 2025-09-1818:28

      a usable top gaming Intel GPU at good price is a myth :,(

    • By tonyhart7 2025-09-1818:412 reply

      nah, nvidia wouldn't do that

      it would invite a DOJ case

      • By bootsmann 2025-09-1818:47

        That's something Jensen Huang can do away with by bringing a golden GPU statue to the White House.

      • By jandrese 2025-09-1819:30

        Assuming the DoJ is functional and paying attention.

    • By cyanydeez 2025-09-1822:00

      No no, it's in NVIDIA's interest to ensure it's just good enough for the plebs, so they can continue to gouge the high-end market.

    • By matheusmoreira 2025-09-1817:422 reply

      I agree. As a Linux user who favors Intel hardware due to their Linux support, I gotta say the future looks bleak.

      • By throwaway2037 2025-09-190:081 reply

            > As a Linux user who favors Intel hardware due to their Linux support
        
        I'm confused here. Are you talking about Intel CPUs or GPUs? And does AMD not have excellent Linux support for their own CPUs and GPUs?

        • By matheusmoreira 2025-09-192:10

          > Intel CPUs or GPUs

          Both. Also things like sound cards, network cards, peripherals in general.

          My happiness and stability while using Linux has been well correlated with the number of devices with Intel in the name. Every single device without Intel invariably becomes a major pain in my ass every single time.

          It's gotten to the point I assume it will just work if it's Intel.

          > And does AMD not have excellent Linux support for their own CPUs and GPUs?

          They're making a lot of progress but Intel is still years ahead of them.

          Earlier this year I was researching PC parts for a build and discovered AMD was still working on merging functionality like on die temperature sensors into the kernel. It makes me think I won't have a full feature set on Linux if I buy one of their processors.

      • By tliltocatl 2025-09-1818:28

        Well, AMD isn't going away yet, and they do seem to have finally realised the advantage of open-source drivers. But that's still very bad for competition and prices.

  • By littlecranky67 2025-09-1814:5317 reply

    This is a death blow to the Intel GPU+AI efforts and should not be allowed by the regulators. It is clear that Intel needs the downstream, low-cost GPU market segment to have a portfolio of AI chips based on chiplets, where most defective ones end up in the consumer-grade GPUs based on manufacturing yield. Nvidia's interest is now for Intel to enter neither the GPU market nor the AI market - which Intel was preparing for with its GPU efforts in recent years.

    • By paxys 2025-09-1817:363 reply

      The US government is itself a major shareholder in Intel, and has every incentive to push Intel stock over its competitors. It's almost a certainty that Nvidia was forced into this deal by the government as well. We are way beyond regulation here.

      • By sabhiram 2025-09-1819:111 reply

        Yep, there is absolutely no problem with that at all.

        Never imagined politics so obviously manipulating the talking heads with nary a care about perception.

      • By bee_rider 2025-09-1818:074 reply

        The US government isn’t (or at least shouldn’t be) profit-motivated anyway, so it isn’t obvious what their incentives are WRT Intel’s stock.

        • By jerf 2025-09-1818:465 reply

          They want a source of chips for the wars they want to conduct that is not either controlled by the party they want to go war with, or way way closer to the party they want to go to war with than they are. Buying a chunk of Intel is a way of making sure they do the things the government wants that will lead to that outcome. Or at least so the theory goes; I've got my own cynicism on this matter and wouldn't dream of tamping down on anyone else's.

          Right now if the US wants to go to war with China, or anyone China really really likes, they can expect with high probability to very quickly encounter major problems getting the best chips. AIUI the world has other fab capacity that isn't in Taiwan, and some of it is even in the US, but they're all on much older processes. Some things it's not a problem that maybe you end up with an older 500MHz processor, but some things it's just a non-starter, like high-end AI.

          Sibling commenters discussing profits are on the wrong track. Intel's 2024 revenue, not profits, was $53.1 billion. The Federal Government in 2024 spent $6,800 billion. No entity doing $1.8 trillion in 2024 in deficit spending gives a rat's ass about "profits". The US Federal government just spends what it wants to spend, it doesn't have any need to generate any sort of "profits" first. Thinking the Federal government cares about profits is being nowhere near cynical enough.

          • By paxys 2025-09-1818:52

            This is generally true even setting aside the "war with China" angle. Intel is a large domestic company employing hundreds of thousands in a very critical sector, and the government has every incentive to prevent it from failing. In the last two decades we've bailed out auto companies and banks and US Steel (kinda) for the same reason.

          • By lebimas 2025-09-1819:25

            Concisely put. This is exactly the reasoning. The US is preparing for a potential war with China in 2026 or 2027, and this is how it is beginning preparations.

          • By mosura 2025-09-1820:29

            > Right now if the US wants to go to war with China

            The US is desperate to not have that war, because they spent so long in denial about how sophisticated China has become that it would be a total humiliation. What you see as the US wanting war is them simply playing catch up.

          • By auggierose 2025-09-192:443 reply

            I find it funny that people talk about a US/China war as a real possibility. You are aware that that would be the end of life on earth as we know it, right?

            • By jerf 2025-09-1914:04

              Unfortunately, "it would end life on Earth as we know it" is not, on its own terms, a thing that will stop it from happening. All it takes is the people who can make the decision deciding to do it because they think they will come out ahead, and not caring about what it may do to anyone else. And they don't even have to be right. They just have to think they will come out ahead.

              Don't mistake talking about a thing as advocating for that thing. It leaves you completely unable to process international politics, and frankly, a lot of other news and discussion as well. If you can only think about things you approve of, your model of the world is worse than useless.

            • By cutemonster 2025-09-1912:351 reply

              Pretty likely, I think, it'd be a geographically restricted war.

              The countries wouldn't fire nukes against each other's mainlands but maybe against each other's fleets. Pretty likely

              • By bee_rider 2025-09-1916:47

                We haven’t really tested the idea of a geographically restricted war. During the Cold War there were some pretty transparent proxy wars, but the proxy still allowed for backing out and saving face.

                I don’t think geographically restricting a war is even possible, really. The US’s typical game plan involves hitting the enemy’s decision-making capabilities faster than they can react. That goes out the window if we can’t hit each other’s mainlands. A war where we don’t get to use our strongest trick and China keeps their massive industrial base is an absurd losing one that the US would be totally nuts to sign up for.

                Anyway, we and China can be perfectly good peaceful competitors.

            • By Traubenfuchs 2025-09-197:21

              What even would be the goals of such wars?

              Destroy the other country?

              Take it over?

              Be in a 1984 style „fake“ war forever?

          • By bee_rider 2025-09-1818:51

            Sure, but this is an interest independent of the government holding Intel stock.

            The US government always ought to have the interest of US companies in mind, their job is to work in the interest of the voters and a lot of us work for US companies.

        • By pbhjpbhj 2025-09-1818:151 reply

          They can buy enough stock to shift the price, then use that as a lever to control their own investments' prices (and thence profits). Like they've done with tariffs.

          • By bee_rider 2025-09-1818:481 reply

            That sounds more like an abuse of government powers for individual gain than any legitimate government interest. If that was the plan it would make just as much sense to short a company and then announce a plan to put them under greater regulatory scrutiny.

            • By janc_ 2025-09-1822:041 reply

              You think they haven't done that sort of thing yet?

              • By bee_rider 2025-09-194:481 reply

                Well, I wouldn’t be able to prove it if challenged. And anyway, it seems better overall to not start building the case that that’s just something we expect politicians to do.

                A shocking surprise needs to be a surprise for it to work. Call it strategic naivety if you want.

                • By nolist_policy 2025-09-197:571 reply

                  Donald Trump's erratic tariff policies are surprising.

                  Donald announces tariffs and the markets react. He postpones tariffs and the markets react again. Only Donald and his friends know what he will announce next.

                  • By bee_rider 2025-09-1915:52

                    > Donald Trump's erratic tariff policies are surprising.

                    This feels like a misreading of what I wrote. The discovery that he is using tariffs to make a personal profit should be surprising.

                    > Donald anounces tariffs and the markets react. He postpones tariffs and the markets react again. Only Donald and his friends know what he will announce next.

                    That wouldn’t surprise me at all, I just don’t think a hypothesis about how he could abuse his power will be very compelling to anybody who doesn’t already think he’s prone to corruption. If anything, I think it starts inoculating people to the idea.

        • By nyc_data_geek1 2025-09-1818:11

          Shouldn't be, yes. Isn't? Have you seen the rhetoric around tariffs? A lot of people thought they wanted the government run like a business, so welcome to the for-profit government society.

      • By lawlessone 2025-09-1817:503 reply

        What happens now if one of these companies implodes? does it pull everything with it?

        • By YeahThisIsMe 2025-09-1817:521 reply

          Why would anything that isn't Intel implode? And what's "everything"?

          • By smegger001 2025-09-1819:061 reply

            a plateauing in AI development leading to another AI Winter causing dotcom bubble 2 electric boogaloo.

            • By yvdriess 2025-09-1914:401 reply

              If anything that would be a boost to Intel. One of their problems is GPU capex sucking the air out of the room.

              • By smegger001 2025-09-2112:08

                Well, the question I answered was "Why would anything that isn't Intel implode", and an AI winter plus another dotcom-style bust would do that to everyone not named Intel.

        • By FirmwareBurner 2025-09-1818:053 reply

          Well, the AI bubble will eventually pop, since none of the major AI chatbots are remotely profitable - even on OpenAI's eyewatering $200/month plan, which very few have been willing to pay for, OpenAI is still losing money. And when it pops, so will Nvidia's stock; it's only a matter of time.

          The AI hype train was built on the premise that AI will progress linearly and eventually end up replacing a lot of well-paid white-collar work, but it has failed to deliver on that promise so far, and progress has flatlined or sometimes even gone backwards (see GPT-5 vs 4o).

          FAANG companies can only absorb these losses for so long before shareholders pull out.

          • By bee_rider 2025-09-1818:382 reply

            The AI bubble pop is probably not something NVIDIA is super looking forward to, but of anybody near the bubble they are the least likely to really get hurt by it.

            They don't really make AI chips; they make the best high-throughput, high-latency chips. When the AI bubble pops, there'll be a next thing (unless we're really screwed). They've got as good a chance of owning that next thing as anybody else does. Even better odds if there are a bunch of unemployed CUDA programmers to work on it.

            • By rusk 2025-09-197:12

              There will be a dramatic reduction in “demand” and Nvidia will be stuck with a massive “surplus”

              There will undoubtedly still be a market for Nvidia chips but it won’t be enough to keep things going as they are.

              A new market opening up with the same demand as AI just at the point that AI pops would be a miracle. Something like being an unsecured bond holder in 2010.

            • By FirmwareBurner 2025-09-197:561 reply

              >When the AI bubble pops, there’ll be a next thing

              And what is that post-AI bubble "next big thing" exactly?

              If there were, you'd already see people putting their money towards it.

              • By bee_rider 2025-09-1912:171 reply

                If I knew I’d definitely keep it to myself and make a bunch of money.

          • By erichocean 2025-09-1818:161 reply

            > AI will replace a lot of well paid white collar work, but it failed to deliver on that promise

            This is comically premature.

            • By FirmwareBurner 2025-09-1818:20

              >This is comically premature.

              When you follow the progress in the last 12 months, it really isn't. Big AI companies spent "hella' stacks" of cash, but delivered next to no progress.

              Progress has flatlined. The "rocket to the moon" phase has already passed us by now.

          • By bdamm 2025-09-1818:133 reply

            The white collar worker doesn't need to be replaced for the bots to be profitable. They just need to become dependent on the bots to increase their productivity to the point where they feel they cannot do their job without the chatbot's help. Then the white collar worker will be happy to fork over cash. We may already be there.

            Also never forget that in technology, more so than in any other industry, showing a loss while actually secretly making a profit is a high art form. There is a lot of land grabbing happening right now, but even so it would be a bit silly to take the public profit/loss figures at face value.

            • By FirmwareBurner 2025-09-1818:241 reply

              >We may already be there.

              Numbers prove we aren't. Sales figures show very few customers are willing to pay $200 per month for the top AI chatbots, and even at $200/month, OpenAI is still taking a loss on that plan, so they're still losing money even with top-dollar customers.

              I think you're unaware of just how unprofitable the big AI products are. This can only go on for so long. We're not in the ZIRP era anymore, where SV VC-funded unicorns could be unprofitable indefinitely and endlessly burn cash on the idea that once they eventually beat all competitors in the race to the bottom and become monopolies, they could finally turn a profit by squeezing users with higher real-world prices. That ship has sailed.

              • By blonder 2025-09-1818:422 reply

                I don't think you can confidently say how it will pan out. Maybe OpenAI is only unprofitable at the 200/month tier because those users are using 20x more compute than the 20/month users. OpenAI claims that they would be profitable if they weren't spending on R&D [1], so they clearly can't be hemorrhaging money that badly on the service side if you take that statement as truthful.

                [1] https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat...

                • By rhetocj23 2025-09-1821:36

                  "OpenAI claims that they would be profitable if they weren't spending on R&D "

                  Ermmm dude they are competing with Google. They have to keep reinvesting otherwise Google captures the users OAI currently has.

                  Free cash flows matter, not accounting earnings. On an FCFF basis they are largely in the red, which means they have to keep raising money; at some point somebody will turn around and ask the difficult questions. This cannot go on forever.

                  And before someone mentions Amazon... Amazon raised enough money to sustain their reinvestment before they eventually got to the place where their EBIT(1-t) was greater than reinvestment.

                  This is not at all what's going on with OAI.

                • By FirmwareBurner 2025-09-197:58

                  >OpenAI claims [...]

                  If you're gonna buy at face value whatever Scam Altman claims, then I have some Theranos shares you might be interested in.

            • By rusk 2025-09-197:14

              > They just need to become dependent on the bots to increase their productivity to the point where they feel they cannot do their job without the chatbot's help

              Correct, but said technology needs to be self sustaining commercially. The cost the white collar worker pays needs to be enough to cover the cost of running the AI + profit

              It seems like we are a long way off that yet but maybe we expect an AI to solve that problem ala Kurzweil

            • By safety1st 2025-09-194:091 reply

              Why are this and the first reply being downvoted? Perfectly legitimate thoughts.

              Anyway, I'd just point out that users don't even need to depend on the bots for increased productivity, they just need to BELIEVE it increases their productivity. Exhibit A being the recent study which found that experienced programmers were actually less productive when they used an LLM, even though they self-reported productivity gains.

              This may not be the first time the tech industry has tricked us into thinking it makes us more productive, when in reality it's just figuring out ways to consume more of our attention. In Deep Work, Cal Newport made the argument that interruptive "network tools" in general decrease focus and therefore productivity, while making you think that you're doing something valuable by staying constantly connected. There was a study on this one too. They looked at consultants who felt that replying as quickly as possible to their clients, even outside of work hours, was important to their job performance. But then when they took the interruptive technologies away, spent more time focusing on their real jobs, and replied to the clients less often, they started producing better work and client feedback scores actually went up.

              Now personally I haven't stopped using an LLM when I code but I'm certainly thinking twice about how I use it these days. I actually have cut out most interruptive technology when I work, i.e. email notifications disabled, not keeping Slack open, phone on silent in a drawer, etc. and it has improved my focus and probably my work quality.

        • By nradov 2025-09-1817:511 reply

          too big to fail

    • By elAhmo 2025-09-1816:583 reply

      Regulators? In this administration?

      There is no such thing.

      • By usefulcat 2025-09-1817:17

        Oh, there absolutely is where freedom of the press is concerned. Look no further than the new 'bias monitor' at CBS.

      • By NewJazz 2025-09-1817:062 reply

        Wdym FCC just shut down that antifa Jimmothy Kimmithy.

        • By SlightlyLeftPad 2025-09-1817:193 reply

          Not shut down; regulated.

          • By SlightlyLeftPad 2025-09-195:48

            For everyone who downvoted. This was intended to be tongue-in-cheek, yes they are effectively the same thing.

          • By lobsterthief 2025-09-1817:393 reply

            The FCC does not have the power to shut down broadcasts based on their content.

            • By NewJazz 2025-09-1818:03

              It does have the power to intimidate broadcasters and pressure them in a variety of ways.

            • By Zacharias030 2025-09-1817:45

              ostensibly it does.

            • By cactacea 2025-09-1818:021 reply

              Sure seems like they're trying to invent one

              • By lobsterthief 2025-09-1822:56

                Yes, exactly. They’re exercising a power they don’t have, which should not hold up in a lawful country. Obviously, here we are.

        • By maxlin 2025-09-1817:335 reply

          [flagged]

          • By lobsterthief 2025-09-1817:383 reply

            He didn’t say anything violent. Have you watched the monologue?

            Even if he did (which he didn’t), I don’t see Fox shutting down anything when one of their presenters recently stated, on air, that we should euthanize our homeless population.

            • By InitialLastName 2025-09-1818:212 reply

              To be clear (not that I agree with this situation): Fox News (where that presenter works) is a cable network, beholden to the cable providers but not a broadcaster. The FCC has relatively little leverage to regulate it, because it does not rely on broadcast licenses.

              ABC is a broadcast network. It relies on a network of affiliates (largely owned by a few big companies) who selectively broadcast its programming both over the airwaves and to cable providers. Those affiliates have individual licenses for their radio broadcasting bandwidth which the FCC does have leverage over (and whose content the FCC has a long history of regulating, but not usually directly over politics, e.g. public interest requirements, profanity, and obscenity laws).

              • By mcmcmc 2025-09-1822:48

                Let’s not pretend that the Trump admin would’ve done anything about it even if they did have leverage. They actively encourage and participate in violent rhetoric when it’s directed towards their perceived enemies. Which includes the homeless.

            • By nerdponx 2025-09-1817:47

              To be fair I don't see what the FCC has to do with it. This is classic Manufacturing Consent behavior.

            • By maxlin 2025-09-1821:302 reply

              Of course I watched it, many times. I didn't say he said anything directly violent, but he spread hateful disinformation about someone's death, entirely against the FBI's findings and common sense, during a time of the highest temperatures in a while. Just to try to win the attention of people that'd rather not look in the mirror.

              This is exactly the kind of disingenuous, dehumanizing behavior that radicalizes people like Tyler. And saying that right now would be like Reagan getting into a spat about something personal during the Cold War.

              • By lobsterthief 2025-09-1822:571 reply

                If his administration’s concern was about turning the temperature up, they are doing absolutely nothing to turn it down.

                • By maxlin 2025-09-191:24

                  Firstly, being human about the death, then being transparent about the investigation are the most important things they could be doing, and they are doing that.

                  Idk how the antifa terror thing is going to go, but that really sounds like a loong time coming. Best by far would now be for the left to take some responsibility, not sink deeper into their "good, x right-winger next" kind of hate spiraling.

              • By pseudalopex 2025-09-195:321 reply

                > he spread hateful disinformation about someone's death, entirely against FBI's findings and common sense

                Did you mean Kimmel asserted the shooter was MAGA? He did not.

                • By maxlin 2025-09-2022:58

                  That is literally exactly what he did say. Absolutely disgraceful. Glad he got fired.

                  Quoting:

                  > We hit some new lows over the weekend with the MAGA gang desperately trying to characterize this kid who murdered Charlie Kirk as anything other than one of them, and doing everything they can to score political points from it

          • By yoyohello13 2025-09-1817:511 reply

            Did you actually watch the clip? Or are you just repeating what you heard on social media?

            • By maxlin 2025-09-1821:221 reply

              Yes I did. The whole clip. A few times. Not easy to watch.

              • By lobsterthief 2025-09-1822:581 reply

                The fact you’re taking this opportunity to state this clip was hard for you to watch says a lot about your ability to consider another’s perspective in this conversation. I’m out.

                • By maxlin 2025-09-191:43

                  "Another perspective"? Are you for real?

                  This is what the absolute scumbag said:

                  > We hit some new lows over the weekend with the MAGA gang desperately trying to characterize this kid who murdered Charlie Kirk as anything other than one of them, and doing everything they can to score political points from it

                  Having seen the murder pretty much live. A father with "prove me wrong" written on the tent he was sitting in, taking non-prescreened questions and debating with everyone that dared to come up to the mic. Shot with kids next to him. The most direct attack on debate itself we've seen during our lifetimes; it wouldn't matter if it was Sanders or any other left-wing, centrist, or right-wing figure doing very reasonable debate there. And after the murder so many heartless people came out to MOCK the death, and celebrate it.

                  Kimmel is lying against not just common sense but what authorities and people around Tyler have said. Absolutely nothing points at the murderer being MAGA, but pretty much every detail points towards him having been radicalized by the left. Unless you're just going to ignore the people that knew him saying he was a leftie, "Bella Ciao" carved into the bullets, and the very obvious fact of him having shot a right-winger who was managing to change people's minds, etc.

                  And after that, you expect me to take Kimmel's comment as level-headed? You are hate trolling. You were successful; I am actually angry. Regardless, I did watch everything and gave a fair response to it. The only people that would not be angered by Kimmel's comments knowing all this have no heart.

          • By hexator 2025-09-1817:361 reply

            Where exactly did he say anything remotely violent?

            • By maxlin 2025-09-1821:241 reply

              If you watch the clip and know what's going on in the US right now, saying such vile disinformation now does nothing but aim to up the temperature.

              That's the deciding reason he got shut down too. Absolute inability to read the "room", even though what he said would be ugly at any point.

              • By hexator 2025-09-1823:282 reply

                He didn't say anything violent and you are spreading misinformation saying he is. That is why your comment got removed.

                You can watch the monologue here https://www.youtube.com/watch?v=-j3YdxNSzTk&t=122s

                • By maxlin 2025-09-191:281 reply

                  My comment wasn't removed, and don't try to put words in my mouth. I said he fanned the flames of violence; raising temperatures with easily disproven misinformation during an extraordinary time just a week from the murder.

                  I've seen the clip; now you go see what the authorities have figured out about the killer. Absolutely nothing points at the murderer being MAGA, but pretty much every detail points towards him having been radicalized by the left. Unless you're just going to ignore the people that knew him saying he was a leftie, "Bella Ciao" carved into the bullets, and the very obvious fact of him having shot a right-winger who was managing to change people's minds, etc.

                • By dabluecaboose 2025-09-192:331 reply

                  [flagged]

                  • By qwerpy 2025-09-192:531 reply

                    > redditization of Hackernews

                    This has been very sad to see this past year.

                    • By slater 2025-09-192:551 reply

                      Please don't post comments saying that HN is turning into Reddit. It's a semi-noob illusion, as old as the hills.

                      https://news.ycombinator.com/newsguidelines.html

                      • By dabluecaboose 2025-09-193:25

                        You might feel smug and superior posting this, but it's quite ironic to post on one specific comment in an otherwise guidelines-breaking, off-topic, political circlejerk subthread like the one we're responding to.

                        This subthread is indeed reddit-esque: Started by a pithy, barely-constructive comment; and followed by pithy witticisms that add nothing to the conversation about Intel and Nvidia, but instead echo popular sentiments about the current administration without saying anything substantive. Meanwhile, the one contrarian opinion was instantly flagged and hidden, despite being level-headed and non-combative.

          • By jjkaczor 2025-09-1817:371 reply

            Actually - he didn't; he made fun of Trump ditching the memorial service...

            Which Trump did.

            • By maxlin 2025-09-1821:261 reply

              You clearly didn't see the clip; what you describe was just a part of it, and there Trump appears to just have not heard the question.

              • By jjkaczor 2025-09-2216:21

                Trump doesn't seem to hear a lot of questions - and has been reported to ramble on about all sorts of nonsense during state visits and high-level meetings.

                Almost like a dementia patient.

                ...but that is just my opinion - even so... not clarifying an asked question is not, well uh a sign of overall "great leadership"...

          • By ohdeargodno 2025-09-1817:50

            [dead]

      • By jrochkind1 2025-09-193:351 reply

        They are superseding regulations with the government just arbitrarily ordering companies to do things instead.

        • By baq 2025-09-199:16

          Mussolini would be proud.

    • By bogwog 2025-09-1815:342 reply

      Intel had an opportunity to differentiate themselves by offering more VRAM than Nvidia is willing to put in their consumer cards. It seemed like that was where Battlemage was going.

      But now, are they really going to undermine this partnership for that? Their GPUs probably aren't going to become a cash cow anytime soon, but this thing probably will. The mindset among American business leaders of the past two decades has been to prioritize short-term profits above all else.

      • By etempleton 2025-09-1817:34

        It may be that Nvidia doesn't really see Intel as a competitor. Intel serves a part of the GPU market that Nvidia has no interest in. This reminds me a bit of Microsoft's investment into Apple. Microsoft avoided the scorn of regulators by keeping Apple around as a competitor, and if they succeed, great, they make money off of the deal.

      • By pchangr 2025-09-1816:241 reply

        I remember when I was studying for an MBA, a professor was talking about the intangible value of a brand, and finance, and how they reflect on each other. At some point we were decomposing the parts of a balance sheet, and someone asked if one could sell the goodwill to invest in something else. The answer was, of course, no. Well, America has proven us wrong. The way you sell the goodwill is basically enshittification: you quickly burn all your brand reputation by lowering your costs with shittier products. Your goodwill goes to zero but your income increases, so the stock goes up. The CEO gets a fat bonus for it, even though the company itself is destroyed, then quickly abandons ship and does the same at their next company. Rinse and repeat... infinite money!

        • By poslathian 2025-09-1817:23

          We always called this "monetizing the brand", and it's been annoying me since at least when Sperry went private equity and the shoes stopped being multi-year daily drivers.

    • By yalogin 2025-09-1817:402 reply

      I don't follow how it's a death knell to Intel AI chips. Nvidia bought shares, not a board seat. Maybe that's the plan, but if you take the example of Microsoft buying Apple shares, that only gave Apple a lifeline to build better. I do understand Nvidia wants to have the whole GPU market to themselves, but how will they do it?

      • By dragonwriter 2025-09-1817:52

        > Nvidia bought shares, not a board seat.

        I think the assumption there is that the strategic partnership that is part of the deal would in effect preclude Intel from aggressively competing with NVIDIA in that market, perhaps with the belief that the US government's financial stake in Intel would also lead to reduced antitrust scrutiny of such an agreement not to compete.

      • By littlecranky67 2025-09-1818:131 reply

        They literally bought board seats - not today, but shares entitle you to vote on board members at the next shareholder meeting. And $5bn of shares buys you a lot of votes.

        • By andirk 2025-09-1818:50

          $5bn may not buy a huge amount of voting power, but if there are close votes on important things then it could be enough to affect the company. Keeping one's enemies closer, regardless of voting, can also help overall.

    • By benced 2025-09-1816:103 reply

      The likelihood that Intel AI was going to catch up with efforts like AWS Trainium, let alone Nvidia, was already vanishingly small. This gives Intel a chance at maintaining leading-edge fab technologies.

      I feel bad for gamers - I’ve been considering buying a B580 - but honestly the consumer welfare of that market is a complete sidenote.

      • By nickysielicki 2025-09-1817:39

        I don’t agree. OneAPI gets a lot of things right that ROCM doesn’t, simply because ROCM is a 1:1 rip of what nvidia provides (warts and historical baggage included) whereas OneAPI was thoughtfully designed and did away with all of that. Intel has a strong history in networking, much stronger than Xilinx/AMD, and really was the best hope we had for an open standard to replace nvidia’s hellscape.

      • By delusional 2025-09-1817:382 reply

        > This gives intel a chance at maintaining leading edge fab technologies.

        I don't think so:

        > The chip giant hasn’t disclosed whether it will use Intel Foundry to produce any of these products yet.

        It seems pretty likely this is an x86 licensing strategy for nvidia. I doubt they're going to be manufacturing anything on intel fabs. I even wonder if this is a play to get an in with Trump by "supporting" his nationalizing intel strategy.

        • By nickysielicki 2025-09-1817:491 reply

          nvidia doesn’t need x86, they’re moving forward on aarch64 and won’t look back. For example, one of the headlines from CUDA 13 is that sbsa can be targeted from all toolkits, not as a separate download, which is important for making it easy to target grace. They have c2c silicon on grace for native host side nvlink. They’re not looking back.

          • By delusional 2025-09-1821:021 reply

            They're clearly looking back though, investing in Intel and announcing quite substantial partnerships. Maybe they're not looking back for technical reasons, but they are looking back.

            • By benced 2025-09-2122:16

              I don't think Nvidia cares about the CPU ISA.

        • By benced 2025-09-2122:15

          I think literally just the cash is a big deal at this point. Additionally, this deal probably increases the chances that Nvidia at least uses some Intel Foundry technology (like packaging) and maybe, very far down the road, fabrication.

      • By overfeed 2025-09-1816:411 reply

        > The likelihood intel AI was going to catch up with efforts like AWS Trainium, let alone Nvidia

        ...and yet Nvidia is not gambling with the odds. Intel could have challenged Nvidia on performance-per-dollar or per watt, even if they failed to match performance in absolute terms (see AMD's Zen 1 vs Intel)

        • By benced 2025-09-2122:16

          You misunderstand enterprise GPUs. Their cost of ownership is dominated by electricity - they're already optimized for performance per watt.

    • By dagmx 2025-09-1815:32

      The regulators want this because it’s bolstering the last domestic owned fab.

      Any down the road repercussions be damned from their perspective.

    • By lvl155 2025-09-1815:452 reply

      Intel doesn’t deserve anything. They deserve to disappear based on how they ran the company as a monopoly. No lessons were learned.

      • By leoc 2025-09-1817:56

        That was quite a long time ago! Intel going down the chutes now isn’t an effective punishment for how it behaved under Andy Grove and won’t deter others from Grove-like behaviour. Instead it’ll just mean even less restraint on any of the big players with market power now, like nVidia, AMD and TSMC.

      • By iends 2025-09-1817:35

        This is likely true in a vacuum, but US national security concerns means the US needs Intel.

    • By usef- 2025-09-195:46

      This isn't about GPU competition, it's about fab competition (which is in far more dire of a situation).

      Intel can no longer fund new process nodes by itself, and no customers want to take the business risk to build their product on a (very difficult) new node when tsmc exists. They're in a chicken and egg situation. (see also https://stratechery.com/2025/u-s-intel/ )

    • By justincormack 2025-09-1815:363 reply

      Consumer gpus are totally different products from the high end gpus now. Intel has failed on the gpu market and has effectively zero market share, so it is not actually clear there is an antitrust issue in that market. It would be nice if there was more competition but there are other players like AMD and a long tail of smaller ones

      • By tw04 2025-09-1815:455 reply

        >Consumer gpus are totally different products from the high end gpus now. Intel has failed on the gpu market and has effectively zero market share, so it is not actually clear there is an antitrust issue in that market. It would be nice if there was more competition but there are other players like AMD and a long tail of smaller ones

        I'm sorry that's just not correct. Intel is literally just getting started in the GPU market, and their last several releases have been nearly exactly what people are asking for. Saying "they've lost" when the newest cards have been on the market for less than a month is ridiculous.

        If they are even mediocre at marketing, the Arc Pro B50 has a chance to be an absolute game changer for devs who don't have a large budget:

        https://www.servethehome.com/intel-arc-pro-b50-review-a-16gb...

        I have absolutely no doubt Nvidia sees that list of "coming features" and will do everything they can to kill that roadmap.

        • By raincole 2025-09-1816:44

          "Intel getting started in GPU market" is like a chain smoker quitting smoking. It's so easy that they have done it 20 times!

        • By tapland 2025-09-1817:181 reply

          The latest Arc GPUs were doing well, and were absolutely an option for entry/mid-level gamers. I think lack of maturity was one of the main things keeping sales down.

          • By Seattle3503 2025-09-1817:28

            I've been seeing a lot of homelab types recommending their video cards for affordable Plex transcoding as well.

        • By bpt3 2025-09-1816:271 reply

          Intel has been making GPUs for over 25 years. Claiming they are just getting started is absurd.

          To that point, they've been "just getting started" in practically every chip market other than x86/x64 CPUs for over 20 years now, and have failed miserably every time.

          If you think Nvidia is doing this because they're afraid of losing market share, you're way off base.

          • By cptskippy 2025-09-1821:251 reply

            There's a very big difference between the MVP graphics chips they've included in CPUs and the Arc discrete GPU.

            • By bpt3 2025-09-1821:451 reply

              Sure, but claiming they have literally just started is completely inaccurate.

              They've been making discrete GPUs on and off since the 80s, and this is at least their 3rd major attempt at it as a company, depending on how you define "major".

              They haven't even just started on this iteration, as the Arc line has been out since 2022.

              The main thing I learned from this submission is how much people hate Nvidia.

              • By cptskippy 2025-09-195:07

                > The main thing I learned from this submission is how much people hate Nvidia.

                I think there's a lot of frustration with Nvidia as of late. Their monopoly was mostly won on the merits of their technology but now that they are a monopoly they have shifted focus from building the best technology to building the most lucrative technology.

                They've demonstrated that they are no longer interested in producing the best gaming GPUs because those might cannibalize their server technology. Instead they seem to focus on crypto and AI while shipping kneecapped cards at outrageous prices.

                People are upset because they fear this deal will somehow influence Intel's GPU ambitions. Unfortunately I'm not sure these folks want to buy Intel GPUs, they just want Nvidia to be scared into competing again so they can buy a good Nvidia card.

                People just need to draw a line in the sand and stop supporting Nvidia.

        • By bigyabai 2025-09-1816:081 reply

            224 GB/s
          
            128 bit 
          
          The monkey's paw curls...

          I love GPU differentiation, but this is one of those areas where Nvidia is justified in shipping less VRAM. With less VRAM, you can use fewer memory controllers to push higher speeds on the same memory!

          For instance, both the B50 and the RTX 2060 use GDDR6 memory. But the 2060 has a 192-bit memory bus, and enjoys ~336 GB/s bandwidth because of it.
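
          As a rough sanity check on those figures, here is a minimal sketch of the peak-bandwidth arithmetic, assuming an effective GDDR6 data rate of 14 Gbps per pin for both cards (an illustrative assumption, not a quoted spec):

            # Theoretical peak bandwidth = (bus width in bytes) x (effective per-pin data rate)
            def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
                return (bus_width_bits / 8) * data_rate_gbps

            # Assumed 14 Gbps effective GDDR6 for both parts (illustrative only).
            print(peak_bandwidth_gb_s(128, 14.0))  # 128-bit bus (B50-class) -> 224.0 GB/s
            print(peak_bandwidth_gb_s(192, 14.0))  # 192-bit bus (RTX 2060)  -> 336.0 GB/s

          Both quoted figures fall out of the same formula, which is why the narrower 128-bit bus shows up directly as lower bandwidth.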

        • By Sohcahtoa82 2025-09-1817:345 reply

          I don't know what anybody would do with such a weak card.

          My RTX 5090 is about 10x faster (measured by FP32 TFLOPS) and I still don't find it to be fast enough. I can't imagine using something so slow for AI/ML. Only 2.2 tokens/sec on an 8B parameter Llama model? That's slower than someone typing.

          I get that it's a budget card, but budget cards are supposed to at least win on a pure price/performance ratio, even with a lower baseline performance. The 5090 is 10x faster but only 6-8x the price, depending on where in the $2-3,000 price range you can find one.

          • By dragonwriter 2025-09-1818:111 reply

            > My RTX 5090 is about 10x faster (measured by FP32 TFLOPS) and I still don't find it to be fast enough. I can't imagine using something so slow for AI/ML. Only 2.2 tokens/sec on an 8B parameter Llama model? That's slower than someone typing.

            It's also orders of magnitude slower than what I normally see cited by people using 5090s; heck, it's even much slower than I see on my own 3080 Ti laptop card for 8B models, though I usually won't use more than an 8bpw quant for that size model.

            • By Sohcahtoa82 2025-09-1820:36

              Yeah, I must be doing something wrong. Someone else pointed out that I should be getting much better performance. I'll be looking into it.

          • By clifflocked 2025-09-1820:301 reply

            I feel as though you are measuring tokens/s wrong, or have a serious bottleneck somewhere. On my i5-10210u (no dedicated graphics, at standard clock speeds), I get ~6 tokens/s on phi4-mini, a 4b model. That means my laptop CPU with a power draw of 15 watts, that was released 6 years ago, is performing better than a 5090.

            > The 5090 is 10x faster but only 6-8x the price

            I don't buy into this argument. A B580 can be bought at MSRP for $250. An RTX 5090 from my local Microcenter is around $3,250. That puts it at around 1/13th the price.

            Power costs can also be a significant factor if you choose to self-host, and I wouldn't want to risk system integrity for 3x the power draw, 13x the price, a melting connector, and Nvidia's terrible driver support.

            EDIT: You can get an RTX 5090 for around 2500$. I doubt it will ever reach MSRP though.

            • By AuryGlenz 2025-09-190:20

              You can get them for $2,000 now. One from Asus has been that price several times over the last few months. I got my PNY for 2200 or so.

          • By jpalawaga 2025-09-1817:511 reply

            you have outlier needs if an RTX 5090, the fastest consumer-grade card, is not good enough for you.

            the intel card is great for 1080p gaming. especially if you're just playing counterstrike, indie games, etc, you don't need a beast.

            very few people are trying to play 4k tombraider on ultra with high refresh rate.

            • By Sohcahtoa82 2025-09-1818:391 reply

              FWIW, my slowness is because of quantizing.

              I've been using Mistral 7B, and I can get 45 tokens/sec, which is PLENTY fast, but to save VRAM so I can game while doing inference (I run an IRC bot that allows people to talk to Mistral), I quantize to 8 bits, which then brings my inference speed down to ~8 tokens/sec.

              For gaming, I absolutely love this card. I can play Cyberpunk 2077 with all the graphics settings set to the maximum and get 120+ fps. Though when playing a much more graphically intense game like that, I certainly need to kill the bot to free up the VRAM. But I can play something simpler like League of Legends and have inference happening while I play with zero impact on game performance.

              I also have 128 GB of system RAM. I've thought about loading the model in both 8-bit and 16-bit into system RAM and just swap which one is in VRAM based on if I'm playing a game so that if I'm not playing something, the bot runs significantly faster.

              • By mysteria 2025-09-1819:121 reply

                Hold on, you're only getting 45 tokens/sec with Mistral 7B on a 5090 of all things? That gets ~240 tokens/sec with Llama 7B quantized to 4 bits on llama.cpp [1] and those models should be pretty similar architecturally.

                I don't know exactly how the scaling works here but considering how LLM inference is memory bandwidth limited you should go beyond 100 tokens/sec with the same model and a 8 bit quantization.

                1. https://github.com/ggml-org/llama.cpp/discussions/15013
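
                To see where that expectation comes from: single-user decoding is memory-bandwidth bound, since roughly all of the model's weights have to be streamed once per generated token, so bandwidth divided by model size gives an upper bound on tokens per second. A minimal back-of-the-envelope sketch, assuming roughly 1.8 TB/s of memory bandwidth for a 5090-class card (an assumed figure for illustration):

                  # Rough ceiling: tokens/sec <= memory bandwidth / bytes streamed per token
                  # (decoding reads ~all weights once per generated token; ignores KV cache etc.)
                  def max_tokens_per_sec(bandwidth_gb_s: float, params_billions: float, bytes_per_param: float) -> float:
                      return bandwidth_gb_s / (params_billions * bytes_per_param)

                  BW = 1800.0  # assumed ~1.8 TB/s for a 5090-class card (illustrative)
                  print(max_tokens_per_sec(BW, 7, 1.0))  # 7B model at 8-bit -> ~257 tok/s ceiling
                  print(max_tokens_per_sec(BW, 7, 2.0))  # 7B model at fp16  -> ~128 tok/s ceiling

                Even with generous overhead, both ceilings sit far above the 45 tokens/sec reported above, which points at a software or configuration bottleneck rather than the hardware.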

                • By Sohcahtoa82 2025-09-1820:241 reply

                  My understanding is that quantizing lowers memory usage but increases compute usage because it still needs to convert the weights to fp16 on the fly at inference time.

                  Clearly I'm doing something wrong if it's a net loss in performance for me. I might have to look more into this.

                  • By mysteria 2025-09-1823:44

                    Yes it increases compute usage but your 5090 has a hell of a lot of compute and the decompression algorithms are pretty simple. Memory is the bottleneck here and unless you have a strange GPU which has lots of fast memory but very weak compute a quantized model should always run faster.

                    If you're using llama.cpp run the benchmark in the link I posted earlier and see what you get; I think there's something like it for vllm as well.

          • By adgjlsfhk1 2025-09-1819:48

            The B60 is ridiculously good for scientific workloads. it's 50% more fp64 flops than a 5090 and 3/4ths the VRAM for 1/4th the price.

          • By ohdeargodno 2025-09-1817:55

            [dead]

      • By realityking 2025-09-1816:05

        > it is not actually clear there is an antitrust issue in that market

        Preempting a (potential) future competitor from entering a market is also an antitrust issue.

      • By Dylan16807 2025-09-1817:051 reply

        Other than the market segmentation over RAM amounts, I don't see very much difference. There's some but there's been some for a long time. Isn't AMD re-unifying their architectures?

        • By 0x457 2025-09-1818:23

          > There's some but there's been some for a long time. Isn't AMD re-unifying their architectures?

          Yes.

          > Other than the market segmentation over RAM amounts, I don't see very much difference.

          The difference between CDNA and RDNA is pretty much how fast it can crunch FP64 and SR-IOV. Prior to RDNA, AMD GPUs were jacks of all trades with compute bias. Which made them bad for gaming unless the game is specifically written around async compute. Vega64 has more FP64 compute than the 4080 for context.

          I think if AMD was able to get a solid market share of datacenter GPUs, they wouldn't have unified. This feels like CDNA team couldn't justify its existence.

    • By bbarnett 2025-09-1818:12

      Does Nvidia now have controlling interest? A bunch of board seats?

      Why would it matter if not? This is a nice partnership. Each gets something the other lacks.

      And it strengthens domestic manufacturing. Taiwan is going to be subsumed soon, and we need more domestic production now.

    • By aDyslecticCrow 2025-09-1818:56

      The alternative currently looks like cutting Intel up piecemeal to make a quick buck just to stay afloat. The GPU division is not profitable and may be destroyed if overall financials don't improve.

    • By JustExAWS 2025-09-1817:012 reply

      Right now, for the US national interests, our biggest concern is that Intel continues to exist. Intel has been making crappy GPUs for 25 years. They weren’t going to start making great GPUs now.

      Besides, who would actually use them if they don’t support CUDA?

      Everyone designs better GPUs than Intel - even Apple’s ARM GPUs have been outpacing Intel for a decade even before the M series.

      • By trenchpilgrim 2025-09-1817:171 reply

        > They weren’t going to start making great GPUs now.

        But that's exactly what they started doing with Battlemage? It's competitive in its price range and was showing generational strides.

        > Besides, who would actually use them if they don’t support CUDA?

        ML is starting to trend away from CUDA towards Vulkan, even on Nvidia hardware, for practical reasons (e.g. performance overhead).

        • By JustExAWS 2025-09-1817:28

          Intel has been trying to make decent GPUs for 25+ years. No company is going to invest billions buying Intel GPUs - especially not the hyper scalers.

      • By tensor 2025-09-1817:241 reply

        Why does it matter if Intel exists if they can't compete? AMD exists. The only point of hoping they remain is to create an environment of competition as that drives development and progress.

        Though fair and free markets are not at all what the current regime in the US believes in; instead it will be consolidation, leading to waste and little innovation or progress.

        • By JustExAWS 2025-09-1817:251 reply

          AMD doesn't have a foundry. They are irrelevant.

          • By tensor 2025-09-1817:381 reply

            Well, I guess enjoy using your 3rd world Intel GPUs. A shitty foundry is irrelevant.

            • By JustExAWS 2025-09-1817:442 reply

              Intel isn’t that far behind. But it is dumb to depend on fabs in a country that is just one Chinese missile away from getting destroyed.

              • By iamtedd 2025-09-1818:101 reply

                That's most of the world, including the USA. https://en.wikipedia.org/wiki/China_and_weapons_of_mass_dest...

                • By raw_anon_1111 2025-09-1818:201 reply

                  So you don’t see the difference in the threat level of China bombing and invading Taiwan - which they already claim they own - and China attacking the US directly?

                  • By iamtedd 2025-09-1818:411 reply

                    I don't, because I'm not in the US. But my comment was in reply to the actual text of the grandparent, not some imagined subtext between the lines.

                    • By raw_anon_1111 2025-09-1819:31

                      So it's just an imagined subtext that China, which has been rabble-rousing about taking over Taiwan, is more likely to attack a tiny island nation right next to it than to attack the US?

              • By iszomer 2025-09-1818:081 reply

                And why would they when TSMC is in both China and the US in some fashion?

                • By raw_anon_1111 2025-09-1818:231 reply

                  And Taiwan is forbidding TSMC from building their cutting edge fabs in the US…

                  https://www.asiafinancial.com/taiwan-says-tsmc-not-allowed-t...

                  That may have changed since then. But do you really want to depend on a foreign government for chip manufacturing?

                  • By iszomer 2025-09-252:14

                    As Taiwan should; it's their prerogative. People often think that when global policy changes abruptly everything stops; in reality, the contrary is true: supply chains and demand shift.

                    For what it's worth, it's TSMC's expertise in semiconductor manufacturing that has been loaned to the US, not bought, settled, and forgotten.

    • By stevenally 2025-09-1818:12

      Nvidia only owns 4% of Intel. They won't be able to dictate its direction.

    • By ErigmolCt 2025-09-1816:24

      If anything, it might be more of a strategic retreat or a hedged bet

    • By alexnewman 2025-09-1817:40

      Wait, what? Intel GPU+AI efforts? People had to come together to fund the abandoned Intel SW development team. Intel GPUs are great at what they do, but they are no Nvidia. I don't even think that was on the roadmap. Also, you don't know what Nvidia wants. Maybe they want to flood the low end to destroy AMD, benefiting consumers. We just don't know.

  • By scrlk 2025-09-1812:128 reply

    > For personal computing, Intel will build and offer to the market x86 system-on-chips (SOCs) that integrate NVIDIA RTX GPU chiplets. These new x86 RTX SOCs will power a wide range of PCs that demand integration of world-class CPUs and GPUs.

    https://www.intc.com/news-events/press-releases/detail/1750/...

    What’s old is new again: back in 2017, Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped: https://www.tomshardware.com/news/intel-discontinue-kaby-lak...

    • By phkahler 2025-09-1813:485 reply

      I don't think this is Intel trying to save itself, it's nVidia. Intel GPUs have been in 3rd place for a long time, but their integrated graphics are widely available and come in 2nd place because nVidia can't compete in the x86 space. Intel graphics have been closing the gap with AMD and are now within what? A factor of 2 or less (1.5?)

      IMHO we will soon see more small/quiet PCs without a slot for a graphics card, relying on integrated graphics. nVidia has no place in that future. But now, by dropping $5B on Intel they can get into some of these SoCs and not become irrelevant.

      The nice thing for Intel is that they might be able to claim graphics superiority in SoC land since they are currently lagging in CPU.

      • By jonbiggums22 2025-09-1814:283 reply

        Way back in the mid-late 2000s, Intel CPUs could be used with third-party chipsets not manufactured by Intel. This had been going on forever, but the space was particularly wild, with Nvidia being the most popular chipset manufacturer for AMD and also making inroads for Intel CPUs. It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, they promptly bought the company and spun down operations.

        This was all for naught, as AMD purchased ATi, shutting out all other chipsets, and Intel did the same. Things actually looked pretty grim for Nvidia at this point in time. AMD was making moves that suggested APUs were the future, and Intel started releasing platforms with very little PCIe connectivity, prompting Nvidia to build things like the Ion platform that could operate over an anemic PCIe x1 link. These really were the beginnings of strategic moves to lock Nvidia out of their own market.

        Fortunately, Nvidia won a lawsuit against Intel that required them to have PCIe x16 connectivity on their main platforms for 10 years or so, and AMD put out non-competitive offerings in the CPU space such that the APU takeoff never happened. If Intel had actually developed their integrated GPUs, or won that lawsuit, or if AMD had actually executed, Nvidia might well be an also-ran right around now.

        To their credit, Nvidia really took advantage of their competitors' inability to press their huge strategic advantage during that time. I think we're in a different landscape at the moment. Neither AMD nor Intel can afford to boot Nvidia, since consumers would likely abandon them for whoever could still slot in an Nvidia card. High-performance graphics is the domain of add-in boards now and will be for a while. Process node shrinks aren't as easy and cooling solutions are getting crazy.

        But Nvidia has been shut out of the new handheld market and hasn't been a good total package for consoles, as SoCs rule the day in both of those spaces, so I'm not super surprised at the desire for this pairing. But I did think Nvidia had given up these ambitions and was planning to try to build an adjacent Arm-based platform as a potential escape hatch.

        • By to11mtm 2025-09-1817:123 reply

          > It was an important enough market that when ALi introduced AMD chipsets that were better than Nvidia's, they promptly bought the company and spun down operations.

          This feels like a 'brand new sentence' to me because I've never met an ALi chipset that I liked. Every one I ever used had some shitty quirk that made VIA or SiS somehow more palatable [0] [1].

          > Intel started releasing platforms with very little PCIe connectivity,

          This is also a semi-weird statement to me, in that it was nothing new; Intel already had an established history of chipsets like the i810, 845GV, 865GV, etc., which all lacked AGP. [2]

          [0] - Aladdin V with its AGP instabilities; MAGiK 1 with its poor handling of more than 2 or 3 'rows' of DDR (i.e. two double-sided sticks of DDR turned it into a shitshow no matter what you did to timings; 3 was usually 'ok-ish' and 2 was stable).

          [1] - SIS 730 and 735 were great chipsets for the money and TBH the closest to the AMD760 for stability.

          [2] - If I had a dollar for every time I got to break the news to someone that there was no real way to put a Geforce or 'Radon' [3] in their eMachine, I could have had a then-decent down payment for a car.

          [3] - Although, in an odd sort of foreshadowing, most people who called it a 'Radon', would specifically call it an AMD Radon... and now here we are. Oddly prescient.

          • By hakfoo 2025-09-196:25

            I'm thinking the era of "great ALI chipsets" was more after they became ULi in the Athlon 64 era.

            I had a ULi M1695 board (ASRock 939SLI32-eSATA2) and it was unusual for the era in that it was a $90 motherboard with two full x16 slots. Even most of the nForce boards at the time had it set up as x8/x8. For like 10 minutes you could run SLI with it until nVidia deliberately crippled the GeForce drivers to not permit it, but I was using it with a pretty unambitious (but fanless-- remember fanless GPUs?) 7600GS.

            They also did another chipset pairing that offered a PCIe x16 slot and a fairly compatible AGP-ish slot for people who had bought an expensive graphics card (which then meant $300 for a 256MB card) and wanted to carry it over. There were a few other boards using other chipsets (maybe VIA) that tried to glue together something like that, but the support was much more hit-or-miss.

            OTOH, I did have an Aladdin IV ("TXpro") board back in the day, and it was nice because it supported 83MHz bus speeds when a "better" Intel TX board wouldn't. A K6-233 overclocked to 250 (3x83) was detectably faster than at 262 (3.5x75)

          • By RulerOf 2025-09-204:47

            > most people who called it a 'Radon', would specifically call it an AMD Radon

            If I had a dollar for the number of times I heard an IT professional say "Intel Xenon" I'd probably match your down payment.

          • By jonbiggums22 2025-09-1819:11

            ALi was indeed pretty much on the avoid list for me for most of their history. It was only when they came out with the ULi M1695, made famous by the ASRock 939Dual-SATA2, that they were a contender for best out of nowhere. One of the coolest boards I ever owned, and it was rock solid for me even with all of the weird configs I ran on it. I kind of wish I hadn't sold it, even today!

            I remember a lot of disappointed people on forums who couldn't upgrade their cheap PCs as well, but there were still motherboards available with AGP to slot Intel's best products into. Intel couldn't just remove it from the landscape altogether (assuming they wanted to) because they weren't the only company making chipsets that supported Intel CPUs. IIRC Intel/AMD/Nvidia were not interested in making AGP+PCIe chipsets at all, but VIA/ALi and maybe SiS made them instead because it was still a free-for-all space. Once that went away, Nvidia couldn't control their own destiny.

        • By ninetyninenine 2025-09-1814:451 reply

          Nvidia does build SoCs already - the AGXs and other offerings. I'm curious why they want Intel despite having the technical capability to build SoCs.

          I realize the AGX is more of a low-power solution, and it's possible that Nvidia is still technically limited when building SoCs, but this is just speculation.

          Does anybody know the actual ground-truth reasoning for why Nvidia is buying into Intel despite the fact that Nvidia can make its own SoCs?

          • By KeplerBoy 2025-09-1820:38

            Why is Nvidia partnering with Mediatek for CPU cores in the dgx spark? Different question, probably the same answer.

        • By whatevaa 2025-09-1814:591 reply

          Nvidia just doesn't care about the console and handheld markets. They are unwilling to make customisations, and it's a low-margin business.

      • By wirybeige 2025-09-1813:512 reply

        Xe2 is already superior to AMD's current integrated graphics.

        • By yujzgzc 2025-09-1814:03

          I think the comparison was between Nvidia standalone graphics chips and Intel integrated graphics capabilities.

      • By mrheosuper 2025-09-192:42

        > Intel graphics have been closing the gap with AMD and are now within what? A factor of 2 or less (1.5?)

        Apart from that one APU from AMD (the 395+), Intel iGPUs are on par with AMD's right now.

        The 395+ is more like a dGPU and CPU on the same die.

      • By SilverbeardUnix 2025-09-1816:22

        Intel hasn't had desktop GPUs for a long time. Your timescale is off compared to how long AMD and Nvidia have had to polish their GPUs.

      • By edm0nd 2025-09-1820:39

        I'm curious, why do you type it as nVidia instead of NVIDIA or Nvidia ?

    • By joz1-k 2025-09-1813:153 reply

      RIP Arc and Gaudi. There is no other way to read this. Fewer competitors => higher prices.

      • By jonbiggums22 2025-09-1814:31

        I think it is bad news for the GPU market (AMD has had a beachhead with their integrated solution here as they've lost out elsewhere) but good for x86 which I've worried would be greatly diminished as Intel became less competitive.

      • By numpad0 2025-09-1817:07

        I just realized there's a worse possibility. They might offer it as the successor to the xx50/xx60 RTX GPUs, dropping CUDA support on the low end.

      • By philistine 2025-09-1814:24

        Absolutely. This is terrible news for high emission gamers, who have been living under the boot of Nvidia for decades.

    • By ddalex 2025-09-1812:212 reply

      That was targeted at supporting more tightly integrated and performant MacBooks... It flopped because Apple came up with the M1, not because it was bad per se.

      • By JonChesterfield 2025-09-1812:25

        The Ryzen APUs had a rocky start but are properly good now; the concept is sound.

      • By intvocoder 2025-09-1812:241 reply

        Apple never shipped a product with that, but it made for an excellent Hackintosh.

    • By linuxftw 2025-09-1812:38

      To me, this just validates what AMD has been doing for over a decade. Integrated GPUs for personal computing are the way forward.

    • By mrheosuper 2025-09-192:41

      > They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped.

      Not sure it flopped, because the only machine with that CPU I could find is the Intel NUC.

    • By 22c 2025-09-193:28

      > What’s old is new again

      Let's go back even further... I get strong nForce vibes from that extract!

    • By newsclues 2025-09-1816:20

      Stick some CUDA cores on the CPU and market it for AI?

    • By herodoturtle 2025-09-1813:08

      > Intel tried something similar with AMD (Kaby Lake-G). They paired a Kaby Lake CPU with a Vega GPU and HBM, but the product flopped

      /me picturing Khaby Lame gesturing his hands at an obvious workaround.
