Fastest Front End Tooling for Humans and AI

2026-02-18 11:51 · cpojer.net

Frontend tooling in 2026+, with and without AI.

2026 is the year JavaScript tooling gets faster. TypeScript is being rewritten in Go, and tools like Oxlint and Oxfmt are getting ready for mass adoption. Humans and LLMs both perform much better in codebases that have a fast feedback loop, strict guardrails, and strong local reasoning. This post aims to help everyone go faster with sensible and strict defaults.1

If you are bored of reading blog posts, you can also watch my Building Scalable Applications talk, send this post directly to your LLM, or get started with one of these templates:

Here is how you can speed up your stack:

tsgo: TypeScript Go

I’ve been using TypeScript’s Go rewrite for the past six months for ~10x faster type checking. There were a few hiccups along the way, but it’s now mostly stable and feature-complete, including editor support.

The main concern I had about switching to an experimental version of TypeScript was regressions to the type checking behavior. However, the opposite was true: tsgo caught type errors that the JavaScript implementation didn’t catch! I adopted tsgo in 20+ projects ranging from 1,000 to 1,000,000 lines of code, and it has improved iteration speed quite a bit.

If you want to migrate to tsgo and currently use TypeScript to compile your code, I recommend first switching to tsdown for libraries or Vite for web apps. tsdown is a fast bundler for libraries based on Rolldown that optimizes your JavaScript bundles.
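For reference, a minimal tsdown setup can be sketched like this (the entry path and options are assumptions, not from the post; check the tsdown docs for your project):

```ts
// tsdown.config.ts — a minimal sketch, not a definitive setup
import { defineConfig } from 'tsdown';

export default defineConfig({
  entry: ['src/index.ts'], // library entry point (adjust as needed)
  dts: true, // emit .d.ts declaration files alongside the bundle
});
```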

Then, the migration to tsgo is quick:

  • npm install @typescript/native-preview
  • Remove any legacy TypeScript config flags
  • Replace every call to tsc with tsgo
  • Add "typescript.experimental.useTsgo": true to your VS Code settings
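Concretely, the script change is small; a hypothetical package.json fragment (the script name and `--noEmit` usage are my assumptions — tsgo mirrors most tsc flags, but verify against the preview docs):

```
"scripts": {
  "check:types": "tsgo --noEmit"
}
```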

Prettier → Oxfmt

I’ve been using Prettier since it was in alpha. Many formatters have been built since then, but none had the feature coverage and plugin system of Prettier. Oxfmt is a great alternative because it has many of Prettier’s plugins built in, such as import and Tailwind CSS class sorting, and it falls back to Prettier for formatting the long tail of languages other than JavaScript/TypeScript.

Migration Prompt:

Migrate this project from Prettier to Oxfmt. Read https://oxc.rs/docs/guide/usage/formatter/migrate-from-prettier.md. Update all scripts, tools, and hooks to use Oxfmt. Remove all Prettier configuration files and reformat the code using Oxfmt.

I recommend installing the Oxc VS Code extension via code --install-extension oxc.oxc-vscode.
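After migrating, a typical pair of format scripts might look like this (script names are my own; whether bare `oxfmt` writes in place should be verified against the Oxfmt docs):

```
"scripts": {
  "format": "oxfmt",
  "format:check": "oxfmt --check"
}
```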

ESLint → Oxlint

Similar to Prettier, there have been many attempts to build new linters. However, the plugin ecosystem around ESLint is hard to beat. Even after I adopted a Rust-based linter, I had to keep using ESLint for lint rules such as the React Compiler plugin.

Oxlint is the first new linter that can run ESLint plugins directly via an ESLint plugin shim and NAPI-RS. Oxlint also supports TypeScript configuration files and you can use extends to compose your configuration:

import nkzw from '@nkzw/oxlint-config';
import { defineConfig } from 'oxlint';

export default defineConfig({
  extends: [nkzw],
});

Migration Prompt:

Migrate this project from ESLint to Oxlint. Read https://oxc.rs/docs/guide/usage/linter/migrate-from-eslint.md. Update all scripts, tools, and hooks to use Oxlint. Remove all ESLint configuration files. Lint the code and fix any lint errors.

Oxlint also supports type-aware lint rules. Install oxlint-tsgolint alongside Oxlint and run oxlint --type-aware. You can even check types directly via oxlint --type-aware --type-check, powered by TypeScript Go!
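Put together, the type-aware setup from this section can be sketched as follows (pnpm is an assumption; use your project's package manager):

```shell
# install the type-aware backend alongside Oxlint
pnpm add -D oxlint oxlint-tsgolint

# run type-aware lint rules
pnpm oxlint --type-aware

# additionally surface type errors, powered by TypeScript Go
pnpm oxlint --type-aware --type-check
```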

@nkzw/oxlint-config

A few weeks ago I asked GPT 5.2 Codex to convert a codebase from one UI framework to another in an empty Git repository. Then I gave it this Web App Template and asked it to do the same conversion in a fresh session. Thanks to the strict guardrails, it did a significantly better job with fewer bugs.

If you aren’t starting a project from scratch with the above template, you can use @nkzw/oxlint-config to get a fast, strict, and comprehensive linting experience out of the box that guides LLMs to write better code with these principles:

  • Error, Never Warn: Warnings are noise and tend to get ignored. Either it’s an issue, or it isn’t. This config forces developers to fix problems or explicitly disable the rule with a comment.
  • Strict, Consistent Code Style: When multiple approaches exist, this configuration enforces the strictest, most consistent code style, preferring modern language features and best practices.
  • Prevent Bugs: Problematic patterns such as instanceof are not allowed, forcing developers to choose more robust patterns. Debug-only code such as console.log or test.only is disallowed to avoid unintended logging in production or accidental CI failures.
  • Fast: Slow rules are avoided. For example, TypeScript’s noUnusedLocals check is preferred over no-unused-vars.
  • Don’t get in the way: Subjective or overly opinionated rules (e.g. style preferences) are disabled. Autofixable rules are preferred to reduce friction and to save time.
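As an illustration of the “Prevent Bugs” principle above, here is a hypothetical sketch (types and names are mine, not from the config) of the pattern a ban on instanceof nudges you toward: a discriminated union with a plain tag field, which keeps working across realms and serialization boundaries where instanceof breaks.

```typescript
// Instead of checking `shape instanceof Circle`, discriminate on a tag
// field. Plain data survives structuredClone, postMessage, and bundler
// realm boundaries, and TypeScript narrows the union in each case.
type Shape =
  | { kind: 'circle'; radius: number }
  | { kind: 'square'; size: number };

function area(shape: Shape): number {
  switch (shape.kind) {
    case 'circle':
      return Math.PI * shape.radius ** 2;
    case 'square':
      return shape.size ** 2;
  }
}
```

The switch is exhaustive over the union, so adding a new `kind` becomes a compile-time error at every call site instead of a silent runtime fallthrough.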

I believe @nkzw/oxlint-config is the first package that brings together a comprehensive set of strict built-in and JS plugins for Oxlint. Give it a try!

Migration Prompt:

Migrate this project from ESLint to Oxlint using @nkzw/oxlint-config. Read https://raw.githubusercontent.com/nkzw-tech/oxlint-config/refs/heads/main/README.md and https://oxc.rs/docs/guide/usage/linter/migrate-from-eslint.md. Update all scripts, tools and hooks to use Oxlint. Remove all ESLint configuration files.

Smaller DevX Optimizations

npm-run-all2

I still like npm-run-all22 to parallelize scripts for fast local runs:

"scripts": {
  "lint:format": "oxfmt --check",
  "lint": "oxlint",
  "check": "npm-run-all --parallel tsc lint lint:format",
  "tsc": "tsgo"
}

There are many complex tools and some package managers have parallelization built-in, but for small things this package works surprisingly well:

  • It doesn’t add its own logging overhead.
  • It doesn’t tear and interleave output from different jobs. It only prints the output of one job at a time.
  • It exits as soon as one job fails.
  • When you type ctrl+c, it actually shuts everything down immediately.

ts-node

While there are many solutions now to run TypeScript during development, I still haven’t found one that supports all of TypeScript (JSX, enums, etc.) and is faster than nodemon, ts-node, and swc combined for running Node.js servers that instantly restart on file changes:

pnpm nodemon -q -I --exec node --no-warnings --experimental-specifier-resolution=node --loader ts-node/esm --env-file .env index.ts

And in your tsconfig.json:

"ts-node": {
  "transpileOnly": true,
  "transpiler": "ts-node/transpilers/swc",
  "files": true,
  "compilerOptions": {
    "module": "esnext",
    "isolatedModules": false
  }
}

I auto-save as I type (on the days I’m still coding by hand). When the changes affect a Node.js service, I want it to restart instantly on every keypress. I feel like I have tried everything under the sun and nothing comes close to being as fast as this combination. If you know of one that doesn’t have any trade-offs, please DM me.

Still great

It’s worth mentioning the tools that I still use every day since the last time I wrote about this topic.

pnpm

pnpm is the best package manager for JavaScript. It’s fast and full-featured.

Vite

I can’t imagine starting a web project with a bundler and dev server other than Vite. It’s the fastest, most stable, and most extensible platform to build for the web. Soon it’ll be even faster with Rolldown under the hood.

React

I’ve tried various UI frameworks but I keep coming back to React. The React Compiler keeps it fast, and Async React keeps it modern. I recently built fate, a modern data client for React & tRPC. Try it!

JavaScript tools need to be fast, stable, and feature-complete. There have been many attempts in recent years to build new tools, but they all required compromises. With the new tools above, you won’t have to compromise.3



Comments

  • By conartist6 2026-02-18 14:18

    It's funny to me that people should look at this situation and say "this is OK".

    The upshot of all these projects to make JS tools faster is a fractured ecosystem. Who, if given the choice, would honestly want to try to maintain JavaScript tools written in a mixture of Rust and Go? Already we've seemingly committed to having a big schism in the middle. And the new tools don't replace the old ones, so to own your tools you'll need to make Rust, Go, and JS all work together using a mix of clean modern technology and shims into horrible legacy technology. We have to maintain everything, old and new, because it's all still critical, engineers have to learn everything, old and new, because it's all still critical.

    All I really see is an explosion of complexity.

    • By dfabulich 2026-02-18 19:32

      So, what's your counterproposal?

      Each of these tools provides real value.

      * Bundlers drastically improve runtime performance, but it's tricky to figure out what to bundle where and how.

      * Linting tools and type-safety checkers detect bugs before they happen, but they can be arbitrarily complex, and benefit from type annotations. (TypeScript won the type-annotation war in the marketplace against other competing type annotations, including Meta's Flow and Google's Closure Compiler.)

      * Code formatters automatically ensure consistent formatting.

      * Package installers are really important and a hugely complex problem in a performance-sensitive and security-sensitive area. (Managing dependency conflicts/diamonds, caching, platform-specific builds…)

      As long as developers benefit from using bundlers, linters, type checkers, code formatters, and package installers, and as long as it's possible to make these tools faster and/or better, someone's going to try.

      And here you are, incredulous that anyone thinks this is OK…? Because we should just … not use these tools? Not make them faster? Not improve their DX? Standardize on one and then staunchly refuse to improve it…?

      • By conartist6 2026-02-18 20:07

        I'm being a little coy because I do have a very detailed proposal.

        I want the JS toolchain to stay written in JS, but I want to unify the design and architecture of all those tools you mentioned so that they can all use a common syntax tree format and share data, e.g. between the linter and the formatter or the bundler and the type checker.

        • By notnullorvoid 2026-02-18 21:20

          Yeah, it's a shame that few people realize that running 3 (or more) different programs, each with its own parser and AST, is the bigger problem.

          • By conartist6 2026-02-18 21:41

            Not just because of perf (though the perf aspect is annoying) but because of how often the three will get out of sync and produce bizarre results

        • By nicoburns 2026-02-18 22:43

          Hasn't that already been tried (10+ years ago) with projects like https://github.com/jquery/esprima ? Which have since seen usage dramatically reduced for performance reasons.

          • By conartist6 2026-02-18 23:31

            Yeah, you are correct. But that means I have the benefit of ten years development in the web platform, as well as having hindsight on the earlier effort.

            I would say the reason the perf costs feel bad there is that the abstraction was unsuccessful. Throughput isn't all that big a deal for a parser if you only need to parse the parts of the code that have actually changed.

        • By 9dev 2026-02-18 22:56

          You can rip fast builds from my cold, dead hands. I’m not looking back to JS-only tooling, and I was there since the gulp days.

          • By conartist6 2026-02-18 23:40

            All I can say for sure is that the reason the old tools were slow was not that the JS runtime is impossible to build fast tools with.

            And anyway, these new tools tend to have a "perf cliff" where you get all the speed of the new tool as long as you stay away from the JS integration API used to support the "long tail" of use cases. Once you fall off the cliff though, you're back to the old slow-JS cost regime...

            • By 9dev 2026-02-19 10:20

              > […] the reason the old tools were slow was not that the JS runtime is impossible to build fast tools with.

              I don't have them at hand right now but there are various detailed write-ups from the maintainers of Vite, oxc, and more, that are addressing this specific argument to point out that indeed the JavaScript runtime was a hard limitation on the throughput they could achieve, making Rust a necessity to improve build speeds.

              • By conartist6 2026-02-19 11:42

                Why do you need high throughput though? Isn't that a metric of how fast a batch processing system is?

                Why are we still treating batch processing as the controlling paradigm for tools that work on code. If we fully embraced incremental recomputation and shifted the focus to how to avoid re-doing the same work over and over, batch processing speed would become largely irrelevant as a metric

    • By CodingJeebus 2026-02-18 15:28

      > We have to maintain everything, old and new, because it's all still critical, engineers have to learn everything, old and new, because it's all still critical.

      I completely agree, but maintenance is a maintainer problem, not the consumer's, at least according to the average user of open source nowadays. One of two things will come out of this: either the wheels start falling off once the community can no longer maintain this fractured tooling as you point out, or companies are going to pick up the slack and start stewarding it (likely looking for opportunities to capture tooling and profit along the way).

      Neither outcome looks particularly appealing.

      • By NewsaHackO 2026-02-18 16:28

        Yes, this just sounds like the run-of-the-mill specialization issue that is affecting every industry (and has been affecting every industry before AI). Web devs learn Javascript/Typescript/frameworks, "middleware" developers learn Rust/Go/C++/etc. to build the web development frameworks, lower-level devs build that, etc. There shouldn’t be a strict need for someone who wants to make websites or web technology to learn Rust or Go unless they want to break into web framework development or WASM stuff. But again, this is just over-specialization that has been happening since forever (or at least since the Industrial revolution).

    • By dcre 2026-02-18 22:10

      I look at it and don't really have an issue with it. I have been using tsc, vite, eslint, and prettier for years. I am in the process of switching my projects to tsgo (which will soon be tsc anyway), oxlint, and oxfmt. It's not a big deal and it's well worth the 10x speed increase. It would be nice if there was one toolchain to rule them all, but that is just not the world we live in.

      • By philipwhiuk 2026-02-18 23:10

        How do you plan to track CVEs flagged on tsgo's native dependencies?

        • By dcre 2026-02-19 0:06

          I only use it for typechecking locally and in CI. I don’t have it generating code. Of course, what is generating my code is esbuild and soon Rolldown, so same issue maybe. If CVEs in tsgo’s deps are a big risk to run locally, I would say I have much bigger problems than that — a hundred programs I run on my machine have this problem.

    • By TheAlexLichter 2026-02-18 22:28

      The good part is that the new tools do replace the old ones, while being compatible. The pattern is:

      * Rolldown is compatible with Rollup's API and can use most Rollup plugins

      * Oxlint supports JS plugins and is ESLint-compatible (it can run ESLint rules easily)

      * Oxfmt plans to support Prettier plugins, in turn using the power of the ecosystem

      * and so on...

      So you get better performance and can still work with your favorite plugins and extend tools "as before".

      Regarding the "mix of technology" or tooling fatigue: I get that. We have to install a lot of tools, even for a simple application. This is where Vite+[0] will shine, bringing the modern and powerful tools together, making them even easier to adopt and reducing the divide in the ecosystem.

      [0] https://viteplus.dev/

      • By notnullorvoid 2026-02-19 3:03

        As far as I'm aware, oxlint only supports plugins for non-type-aware rules, and type-aware rules themselves aren't fully stable because they rely on a fork of tsgo.

        • By TheAlexLichter 2026-02-19 8:13

          That is correct: every rule with a custom parser (e.g. Vue/Svelte/Astro templates) and also type-aware rules can't be used as a JS plugin.

          Type-aware rules are indeed not marked as stable but work like a charm. tsgolint is indeed tsgo + shims + some work, but that won't change soon as tsgo won't have a JS API for a while.

      • By conartist6 2026-02-19 0:44

        So you really think everyone in JS should have to learn Rust or else be excluded from sharing in the ownership of their critical infra..?

        • By TheAlexLichter 2026-02-19 7:38

          1) This is not what I said, no

          2) With AI, language and syntax matter even less nowadays.

          3) There have been a good number of contributors (e.g. for Oxc) who came out of the JS world, so it isn't impossible

          4) Realistically, the avg. web dev does not contribute to tooling internals; at most custom rules or similar. The concepts are a bigger "hurdle" than the lang.

          • By conartist6 2026-02-19 11:10

            That still leaves you admitting that only a small fraction of the served community can really contribute. You'll need to keep all the best benefits of your work for the Plus users or else there would be no reason to buy Plus and no way to keep paying the few to do all the work for the many.

            You're stuck telling people what they can't have (and shouldn't want) while I'm now in a position to just give people what they want. I admire the people who work there, but you need a new business model and fast because I am unequivocally going to collapse your current one.

            • By TheAlexLichter 2026-02-19 11:59

              > That still leaves you admitting that only a small fraction of the served community can really contribute.

              Not really (see above).

              > You'll need to keep all the best benefits of your work for the Plus users or else there would be no reason to buy Plus and no way to keep paying the few to do all the work for the many.

              No, won't happen that way.

              > You're stuck telling people what they can't have (and shouldn't want) while I'm now in a position to just give people what they want.

              Didn't see any software of yours yet, only big talk so far sadly! Besides that, VoidZero will also be in a position to just give people what they want

              • By conartist6 2026-02-19 12:24

                VoidZero is betting I'm just talk, ok, that's fair. But the way you say it makes me think that your evaluation of me is mostly on hearsay, because if you had actually tried to find out how serious I am I suspect you'd be less flippant. You're welcome in the Discord!

      • By lelandfe 2026-02-18 22:38

        e: ahhh frick this is just stupid AI spam for this dude’s project.

        Supports… some ESLint rules. It is not “easy” to add support to Oxlint for the rules it does not support.

        The projects at my work that “switched” to it now use both Eslint and Oxlint. It sucks, but at least a subset of errors are caught much faster.

        • By dcre 2026-02-19 0:10

          Vite+ is not “this dude’s project”, it’s made by the team that makes all the tools discussed in this article.

        • By TheAlexLichter 2026-02-19 0:01

          Yeah, no. Real human here.

          Oxlint does support core rules out of the box but has support for JS plugins[0] as mentioned. If you don't rely on a custom parser (so Svelte or Vue components, for example) things just work. Even React Compiler rules[1].

          [0] https://oxc.rs/docs/guide/usage/linter/js-plugins.html [1] https://github.com/TheAlexLichter/oxlint-react-compiler-rule...

          • By lelandfe 2026-02-19 13:17

            Definitely read AI tonality into the earlier comment, noticed it didn't call out your relationship to it, then saw that you had a comment history plugging it, and made assumptions.

            My apologies. I'll follow through to the links next time.

          • By conartist6 2026-02-19 11:38

            So as long as you only need the pre-installed software it's a great device, eh. I'm the PC to your game console here. Parser extension? Piece of cake for us. Heck, just to showboat we actually extended our es6 parser from our es3 parser, and then implemented each later standard as an extension of the earlier one. We're able to run parsers for pretty much any programming language, and we're making them super easy to write. We can do cross-language transforms with ease. We can be our own system of version control! We're going to be a real threat to GitHub. VoidZero is not even trying to do this stuff. Your vision is just so... small.

            • By TheAlexLichter 2026-02-19 12:00

              As said in another comment: Curious to see what you come up with! Talk is cheap

    • By riskable 2026-02-18 14:37

      > All I really see is an explosion of complexity.

      I thought this was the point of all development in the JavaScript/web ecosystem?

    • By cod1r 2026-02-18 15:32

      It's definitely an explosion of complexity but also something that AI can help manage. So :shrug: ...

      Based on current trends, I don't think people care about knowing how all the parts work (even before these powerful LLMs came along) as long as the job gets done and things get shipped and it mostly works.

  • By fsmedberg 2026-02-18 14:16

    I'm very surprised the article doesn't mention Bun. Bun is significantly faster than Vite & Rolldown, if it's simply speed one is aiming for. More importantly, Bun allows for simplicity. Install Bun and you get a bundler included, TypeScript just works, and it's blazing fast.

    • By yurishimo 2026-02-18 15:02

      IMO Bun and Vite are best suited for slightly different things. Not to say that there isn't a lot of overlap, but if you don't need many of the features Bun provides, it can be a bit overkill.

      Personally, I write a lot of Vue, so using a "first party" environment has a lot of advantages for me. Perhaps if you are a React developer, the swap might be even more straightforward.

      I also think it's important to take into consideration the other two packages mentioned in this post (oxlint & oxfmt) because they are first class citizens in Vite (and soon to be Vite+). Bun might be a _technically_ faster dev server, but if your other tools are still slow, that might be a moot point.

      Also, TypeScript "just works" in Vite as well. I have a project at work that uses `.ts` files without even a `tsconfig` file in the project.

      https://vite.dev/guide/features#typescript

      • By squidsoup 2026-02-18 22:03

        Worth mentioning that both oxfmt/oxc are in alpha. I would put money on them replacing prettier and eslint, but they're not ready for production yet.

    • By kevinfiol 2026-02-18 15:18

      It's been a while since I've tried it, but post-1.0 release of Bun still seemed like beta software and I would get all sorts of hard to understand errors while building a simple CRUD app. My impression from the project is the maintainers were adding so many features that they were spread too thin. Hopefully it's a little more stable now.

    • By dcre 2026-02-18 22:07

      Bun and Vite are not really analogous. Bun includes features that overlap with Vite but Vite does a lot more. (It goes without saying that Bun also does things Vite doesn't do because Bun is a whole JS runtime.)

    • By canadiantim 2026-02-18 14:24

      Bun can replace vite?

      • By netghost 2026-02-18 15:57

        Bun ships with lots of tools built in. It has support for bundling js, html, etc for the browser.

        I suspect that if you want the best results or to hit all the edge cases you'd still want vite, but bun probably covers most needs.

      • By TheAlexLichter 2026-02-19 7:39

        Not really.

  • By gaoshan 2026-02-18 15:03

    This smells of "I like to solve puzzles and fiddle with things" and reminds me of hours spent satisfyingly tweaking my very specific and custom setups for various things technical.

    I, too, like to fiddle with optimizations and tool configuration puzzles but I need to get things done and get them done now. It doesn't seem fast, it seems cumbersome and inconsistent.

    • By ssgodderidge 2026-02-18 15:51

      > It doesn't seem fast, it seems cumbersome and inconsistent

      I think the point of this project is to provide an opinionated set of templates aimed at shipping instead of tinkering, right? "Don't tinker with the backend frameworks, just use this and focus on building the business logic."

    • By conradkay 2026-02-18 17:49

      It seems like all you have to do is paste 2-3 prompts

HackerNews