You too can run malware from NPM (I mean without consequences)

2025-09-09 10:02 · github.com

Demo repo: github.com/naugtur/running-qix-malware

Phishing NPM package authors continues, unsurprisingly. The stakes are not high enough to switch from phishing to anything more advanced (like https://xkcd.com/538/) but seeing article blurbs say "Supply chain Attack" next to "These packages generally receive 2-3 billion downloads per week." might finally be enough to make an impression, one hopes.

This is not a detailed analysis of the attack, there's plenty of that already. If you're looking for one, visit our friends at https://socket.dev/blog/npm-author-qix-compromised-in-major-supply-chain-attack

Instead, let's look at what happens when a compromised dependency like that gets into your app, and how it could have been stopped.

One of the compromised packages was is-arrayish, and I'll use it as the example going forward.

If an app uses the compromised is-arrayish in the browser, the malware overrides fetch, XMLHttpRequest and window.ethereum.request, and whenever it finds a transaction being sent, it replaces the target address with the one of the malware author's addresses that looks most alike.

I won't go into this either, but you can take a look at the summary of "donations" some other friends linked to here: https://intel.arkm.com/explorer/entity/61fbc095-f19b-479d-a037-5469aba332ab

Pretty low impact for an attack this big. Some of it seems to be people mocking the malware author with worthless transfers.
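To make the mechanism concrete, here's a simplified sketch of the address-swapping trick, not the actual payload. `trojanize`, `ATTACKER_ADDRESS` and the fake provider are all illustrative stand-ins; the real malware additionally picks the attacker address that most resembles the original.

```javascript
// Simplified sketch of the interception technique: wrap a provider's
// request method and rewrite the recipient of outgoing transactions.
const ATTACKER_ADDRESS = "0x000000000000000000000000000000000000dead";

function trojanize(provider) {
  const originalRequest = provider.request.bind(provider);
  provider.request = (args) => {
    if (args.method === "eth_sendTransaction" && Array.isArray(args.params)) {
      // Swap the recipient before the wallet ever sees the transaction.
      args.params = args.params.map((tx) => ({ ...tx, to: ATTACKER_ADDRESS }));
    }
    return originalRequest(args);
  };
}

// Demo against a fake provider that just echoes the transaction back:
const fakeProvider = { request: async (args) => args.params[0] };
trojanize(fakeProvider);
fakeProvider
  .request({ method: "eth_sendTransaction", params: [{ to: "0xme", value: "0x1" }] })
  .then((tx) => console.log(tx.to === ATTACKER_ADDRESS)); // logs true
```

Because the wrapper preserves the original method's behavior, the app keeps working and the user sees nothing unusual until the funds move.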

Say we have an app. The app allows the user to send a meaningless transaction to themselves. Don't expect it to make sense. It also uses is-arrayish because otherwise we'd have nothing to demo.

```javascript
const isArrayish = require("is-arrayish");

const button = document.createElement("button");
button.textContent = "Send ETH Transaction";
document.body.appendChild(button);

button.addEventListener("click", async () => {
  const accounts = await window.ethereum.request({
    method: "eth_requestAccounts",
  });
  if (!isArrayish(accounts)) {
    throw new Error("Accounts response must be array-like");
  }
  const myAddr = accounts[0];
  const txHash = await window.ethereum.request({
    method: "eth_sendTransaction",
    params: [
      {
        value: "0x5af3107a4000",
        from: myAddr,
        to: myAddr,
      },
    ],
  });
  console.log("Transaction sent:", txHash);
});
```

This is what it looks like: [screenshot in the original post]

Now after you update is-arrayish to 0.3.3 and rebuild the project, you might notice a slight difference.

If your project was set up with LavaMoat, you'd be using a policy to decide which package is allowed access to what. More about policies in the guide.

With LavaMoat, all is-arrayish can do is fail:

TypeError: Cannot define property fetch, object is not extensible

BTW, if the malware had been written a little better, to avoid detection by failing silently, the functionality of the app would have remained fully intact.
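That error is exactly what a locked-down global produces. A minimal reproduction of the same failure mode, using plain `Object.preventExtensions` as a stand-in for the hardened Compartment global:

```javascript
"use strict";
// Stand-in for a Compartment's hardened global object.
const compartmentGlobal = { console };
Object.preventExtensions(compartmentGlobal);

try {
  // What the malware effectively attempts: installing its own fetch.
  Object.defineProperty(compartmentGlobal, "fetch", { value: () => {} });
} catch (err) {
  // TypeError: Cannot define property fetch, object is not extensible
  console.log(err.message);
}
```

The malware assumes it can mutate the global scope; once that assumption fails loudly, the attack is dead on arrival.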

To protect the project, @lavamoat/webpack was used.

In short, what it does is: it puts the modules from every dependency in a separate lexical global context that we call a Compartment and only allows access to the globals the policy lists. It also controls which packages can import which other packages.

If a dependency gets updated to contain malicious code, the policy will not allow it to access any globals or imports it didn't use before.
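For illustration, a policy is a JSON file listing, per package, the globals and imports each one may use. The entries below are hypothetical, not the real policy from the demo repo; see the LavaMoat guide for the exact format:

```json
{
  "resources": {
    "is-arrayish": {},
    "some-http-client": {
      "globals": {
        "fetch": true
      },
      "packages": {
        "some-parser": true
      }
    }
  }
}
```

With an entry like the first one, is-arrayish gets no globals at all, so the injected payload's attempt to touch fetch or XMLHttpRequest fails immediately.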

Read more in the official guide



Comments

  • By jefozabuss 2025-09-09 12:36, 6 replies

    Seems like people already forgot about Jia Tan.

    By the way, why doesn't npm already have a system in place to flag sketchy releases where most of the code looks normal and there is newly added obfuscated code with hexadecimal variable names and array lookups for execution...

    • By mystifyingpoi 2025-09-09 13:15, 5 replies

      Detecting sketchy-looking hex codes should be pretty straightforward, but then I imagine there are ways to make sketchy code non-sketchy, which would be immediately used. I can imagine a big JS function, that pretends to do legit data manip, but in the process creates the payload.

      • By hombre_fatal 2025-09-0913:34

        Yeah, it’s merely a fluke that the malware author used some crappy online obfuscator that created those hex code variables. It would have been less work and less suspicious if they just kept their original semantic variable names like “originalFetch”.

      • By nicce 2025-09-0913:23

        It is just about bringing the classic non-signature based antivirus software to the release cycle. Hard to say how useful it is, but usually it is endless cat-and-mouse play like with everything else.

      • By Cthulhu_ 2025-09-0914:08

        It wouldn't be just one signal, but several - like a mere patch version that adds several kilobytes of code, long lines, etc. Or a release after a long silent period.

      • By cluckindan 2025-09-09 15:19, 1 reply

        A complexity per line check would have flagged it.

        Even a max line length check would have flagged it.
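[Editor's aside: the checks described above are cheap to implement. A toy sketch with arbitrary, hypothetical thresholds, not a real scanner:]

```javascript
// Toy release check: flag sources with very long lines or many _0x-style
// identifiers, both typical of machine-obfuscated payloads.
function looksObfuscated(source, maxLineLength = 500) {
  const hasLongLine = source.split("\n").some((l) => l.length > maxLineLength);
  const hexNames = new Set(source.match(/_0x[0-9a-f]{4,}/gi) || []);
  return hasLongLine || hexNames.size > 5;
}

console.log(looksObfuscated("const sum = (a, b) => a + b;\n")); // false
console.log(looksObfuscated("var _0x112fa8=_0x180f;" + "x".repeat(600))); // true
```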

        • By chatmasta 2025-09-09 15:58, 3 replies

          That would flag a huge percentage of JS packages that ship with minified code.

          • By cluckindan 2025-09-109:28

            Why would you be including minified code in a build? That’s just bad practice and makes development-time debugging more difficult.

          • By saghm 2025-09-1016:37

            It's not like minified JS can't be parsed and processed as AST. You could still pretty easily split up each statement/assignment to check the length of each one individually.

          • By jay_kyburz 2025-09-09 23:34, 2 replies

            How are people verifying their dependencies if they are minified?

            • By SethTro 2025-09-100:22

              That's the magic part, they aren't

            • By chatmasta 2025-09-10 1:19, 1 reply

              My guy… in the JS ecosystem a “lock file” is something that restricts your package installer to an arbitrary range of packages, i.e. no restrictions at all and completely unpredictable. You have to go out of your way to “pin” a package to a specific version.

              • By Izkata 2025-09-10 1:32, 2 replies

                Lockfiles use exact hashes, not versions/version ranges. Javascript projects use two files, a package file with version ranges (used when upgrading) and a lockfile with the exact version (used in general when installing in an existing project).

                • By chatmasta 2025-09-10 2:44, 1 reply

                  Sure, but a lockfile with a hash doesn’t mean that next time it will fail if it tries to install a version of the package without that hash. If your package.json specifies a semver range then it’ll pull the latest minor or patch version (which is what happened in this case with e.g. duckdb@1.3.3) and ignore any hash differences if the version has changed. Hence why I say you need to go out of your way to specify an exact version in package.json and then the lock file will work as you might expect a “lock” file to work. (Back when I was an engineer and not a PM with deteriorating coding ability, I had to make a yarn plugin to pin each of our dependencies.)

                  The best way to manage JS dependencies is to pin them to exact versions and rely on renovate bot to update them. Then at least it’s your choice when your code changes. Ideally you can rebuild your project in a decade from now. But if that’s not possible then at least you should have a choice to accept or decline code changes in your dependencies. This is very hard to achieve by default in the JS ecosystem.
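[Editor's aside: the range behavior being discussed can be made concrete. This is a hand-rolled sketch of the caret rule for plain x.y.z versions, not npm's actual resolver:]

```javascript
// Minimal caret-range check. Illustrative only: npm's real semver
// implementation handles prereleases, wildcards and much more.
function satisfiesCaret(range, version) {
  const [maj, min, pat] = range.slice(1).split(".").map(Number);
  const [vMaj, vMin, vPat] = version.split(".").map(Number);
  if (vMaj !== maj) return false;
  if (maj > 0) return vMin > min || (vMin === min && vPat >= pat);
  // For 0.x versions the minor version is the breaking-change boundary.
  return vMin === min && vPat >= pat;
}

console.log(satisfiesCaret("^0.3.2", "0.3.3")); // true: the compromised release is in range
console.log(satisfiesCaret("^0.3.2", "0.4.0")); // false
```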

                  • By jay_kyburz 2025-09-10 2:52, 2 replies

                    I think at some point you would be better off vendoring them in.

                    • By chatmasta 2025-09-103:09

                      That’s effectively what I did in a very roundabout way with docker images and caching that ended up abusing the GitLab free tier for image hosting. When you put it like that it does make me think there was a simpler solution, lol.

                      When I’m hacking on a C project and it’s got a bunch of code ripped out of another project, I’m like “heh, look at these primordial dependency management practices.” But five years later that thing is gonna compile no problem…

                    • By cluckindan 2025-09-109:29

                      There’s even a command for that: npm pack

                • By zdragnar 2025-09-101:51

                  NPM is rather infamous for not exactly respecting the lockfile, however.

      • By cchance 2025-09-0916:55

        Feels like a basic light weight 3b AI model could easily spot shit like this on commit

    • By tom1337 2025-09-09 12:50, 2 replies

      It would also be great if a release needs to be approved by the maintainer via a second factor or an E-Mail verification. Once a release has been published to npm, you have an hour to verify it by clicking a link in an email and then enter another 2FA (separate OTP than for login, Passkey, Yubikey whatever). That would also prevent publishing with lost access keys. If you do not verify the release within the first hour it gets deleted and never published.

      • By naugtur 2025-09-0913:04

        That's why we never went with using keys in CI for publishing. Local machine publishing requires a 2fa.

        Automated publishing should use something like PagerDuty to signal that a version is being published to a group of maintainers, and it requires an approval to go through. And any one of them can veto within 5 minutes.

        But we don't have that, so gotta be careful and prepare for the worst (use LavaMoat for that)

      • By Cthulhu_ 2025-09-0914:10

        Not through e-mail links though, that's what caused this in the first place. E-mail notification, sure, but they should also do a phishing training mail - make it legit, but if people press the link they need to be told that NPM will never send them an email with a link.

    • By dist-epoch 2025-09-09 13:14, 1 reply

      > flag sketchy releases

      Because the malware writers will keep tweaking the code until it passes that check, just like virus writers submit their viruses to VirusTotal until they are undetected.

      • By galaxy_gas 2025-09-09 21:05, 1 reply

        It's typical that the virus writers use their own service; there are criminal VirusTotal clones that run many AVs in VMs and return the results. VirusTotal shares all binaries, so anything uploaded to VirusTotal will be detected shortly if it is not already.

        • By 47282847 2025-09-109:12

          Isn’t it still that when signatures are added at some point it turns out that the malware code has been uploaded months before, or did that change?

    • By AtNightWeCode 2025-09-09 14:45, 1 reply

      The problem is that it is even possible to push builds from dev machines.

    • By hulitu 2025-09-1018:29

      > By the way why doesn't npm have already a system in place to flag sketchy releases

      Because nobody gives a fsck. Normally, after npm was filled with malware, people would avoid it. But it seems that nobody (distro maintainers) cares. People get what they asked for (malware).

  • By CyberMacGyver 2025-09-09 12:08, 1 reply

    Looks like OP is one of the contributors to LavaMoat

    • By naugtur 2025-09-09 13:07, 2 replies

      Yes, I am. I came up with the first successful attempt at integrating LavaMoat's Principle of Least Authority approach with Webpack, and wrote the LavaMoat Webpack Plugin.

      Also, together with a bunch of great folks at TC39 we're trying to get enough building blocks for the same-realm isolation primitives into the language.

      See hardenedjs.org too.

      I'm doing the rounds promoting the project today because at this point all we need to eliminate certain types of malware is get LavaMoat a lot more adoption in the ecosystem.

      ( and that'll give me bug reports and maybe even contributions? :) )

      • By hn92726819 2025-09-09 14:01, 1 reply

        I think most people are fine with promoting a cool project you work on, but it's best practice to disclose that in the article. Even something like "If your project was set up with LavaMoat (a project I've been working on), ..." would be enough.

        I think that's why they made the comment.

        • By naugtur 2025-09-09 14:17, 1 reply

          Yup, and thanks - I should have made the comment myself but got distracted.

          • By EasyMark 2025-09-0914:29

            You're forgiven. Thanks to you (and any other contribs) for the excellent project

      • By btown 2025-09-09 14:06, 1 reply

        I'm often curious about how effective runtime quasi-sandboxing is in practice (at least until support at the TC39 level lands).

        My understanding is that if you can run with a CSP that prevents unsafe-eval, and you lock a utility package down to not be able to access the `window` object, you can prevent it from messing with, say, window.fetch.

        But what about a package that does assume the existence of window or globalThis? Say, a great many packages bridging non-React components into the React ecosystem. Once a package needs even read-only access to `window`, how do you protect against supply-chain attacks on that package? Even if you read-only proxy that object, for instance, can you ensure that nothing in `window` itself holds a reference to the non-proxied `window`?
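[Editor's aside: the escape being asked about is easy to demonstrate with a naive read-only Proxy, as opposed to a full membrane; the objects here are illustrative stand-ins for `window`:]

```javascript
// A naive read-only proxy leaks: any property that references the
// original object hands it back unwrapped.
const realWindow = { secret: "token" };
realWindow.self = realWindow; // browsers do this: window.self === window

const readOnlyView = new Proxy(realWindow, {
  set() { return false; },           // block writes
  defineProperty() { return false; } // block defineProperty
});

console.log(readOnlyView.self === realWindow); // true: the real object escapes
```

A proper membrane has to recursively wrap every value crossing the boundary, which is exactly the hard part.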

        Don't get me wrong - this project is tremendously useful as defense-in-depth. But curious about how much of a barrier it creates in practice against a determined attacker.

        • By naugtur 2025-09-09 15:43, 1 reply

          It's based on HardenedJS.org

          The sandbox itself is tight, there's a bug bounty even.

          The same technology is behind metamask snaps - plugins in a browser extension.

          And Moddable has their own implementation

          The biggest problem is endowing too powerful capabilities.

          We've got ambitious plans for isolating DOM, but that already failed once before.

          • By 1oooqooq 2025-09-100:33

            so to answer the actual question. if something expects too much browser/dom access to work, it won't?

  • By mohsen1 2025-09-09 11:33, 5 replies

    npm should take responsibility and up their game here. It’s possible to analyze the code and mark it as suspicious and delay the publish for stuff like this. It should prevent publishing code like this even if I have a gun to my head

    • By azemetre 2025-09-0912:45

      Why would npm care? They're basically a monopoly in the JS world and under the stewardship of a company that doesn't even care when its host nation gets hacked when using their software due to their ineptitude.

    • By sesm 2025-09-09 11:46, 4 replies

      I think malware check should be opt-in for package authors, but provide some kind of 'verified' badge to the package.

      Edit: typo

      • By yjftsjthsd-h 2025-09-0914:34

        > but provide some kind of 'verified' badge to the package

        I would worry that that results in a false sense of security. Even if the actual badge says "passes some heuristics that catch only the most obvious malicious code", many people will read "totally 100% safe, please use with reckless abandon".

      • By Cthulhu_ 2025-09-09 14:13, 2 replies

        I always thought this would be the ideal monetization path for NPM; enterprises pay them, NPM only supplies verified package releases, ideally delayed by hours/days after release so that anything that slips through the cracks has a chance to get caught.

        • By chrisweekly 2025-09-0914:21

          Enterprises today typically use a custom registry, which can include any desired amount of scans and rigorous controls.

        • By johannes1234321 2025-09-0918:09

          That would put them into liability or be a quite worthless agreement taking no responsibility.

      • By naugtur 2025-09-09 11:58, 1 reply

        npm is on life support by msft. But there's socket.dev that can tell you if a package is malicious within hours of it being published.

        • By shreddit 2025-09-09 12:08, 2 replies

          “within hours” is at least one hour too late, and most likely multiple hours.

          • By naugtur 2025-09-09 12:31, 2 replies

            Absolutely not. You get npm packages by pulling them, not by them being pushed to you as soon as a new version exists. The likelihood of you updating instantly is close to zero, and if it's not, you should set your stuff up so that it is. There are many ways to do that. Even better compared to a month or two, which is how long it often takes for a researcher to find carefully planted malware.

            Anyway, the case where reactive tools (detections, warnings) don't catch it is why LavaMoat exists. It prevents whole classes of malware from working at runtime. The article (and repo) demonstrates that.

            • By rs186 2025-09-09 13:29, 1 reply

              Sure, it should never happen in CI environment. But I bet that every second, someone in the world is running "npm install" to bring in a new dependency to a new/existing project, and the impact of a malicious release can be broad very quickly. Vibe coding is not going to slow this down.

              • By naugtur 2025-09-0913:58

                Vibe coding brings up the need for even more granular isolation. I'm on it ;)

                LavaMoat Webpack Plugin will soon have the ability to treat parts of your app the same as it currently treats packages, with isolation and a policy limiting what they can do.

            • By bavarianbob 2025-09-09 14:34, 1 reply

              I've worked in software supply chain security for two years now and this is an extremely optimistic take. Nearly all organizations are not even remotely close to this level of responsiveness.

              • By naugtur 2025-09-1518:05

                Again, that's why LavaMoat exists. Set it up once and it will block many classes of attacks regardless of where they come from.

          • By Cthulhu_ 2025-09-0914:11

            Depends on whether they hold publishing to the main audience until said scan has finished.

    • By untitaker_ 2025-09-0911:56

      i can guarantee you npm will externalize the cost of false-positive malware scans to package authors.

    • By nodesocket 2025-09-09 12:03, 2 replies

      Or at a minimum support yubikey for 2fa.

      • By mcintyre1994 2025-09-0914:34

        They do, I use a yubikey and it requires me to authenticate with it whenever I publish. They do support weaker 2fa methods as well, but you can choose.

      • By worthless-trash 2025-09-09 12:19, 1 reply

        Original author could be evil. 2fa does nothing.

        • By jamesnorden 2025-09-09 14:33, 1 reply

          If my grandma had wheels she'd be a bike. You don't need to attack the problem from only one angle.

          • By worthless-trash 2025-09-09 14:43, 1 reply

            Your grandma is a bike then. The 2fa is going to solve nothing and any attacker worth their salt knows it.

HackerNews