uv: An extremely fast Python package and project manager, written in Rust


An extremely fast Python package and project manager, written in Rust.

[Benchmark bar chart: installing Trio's dependencies with a warm cache.]

uv is backed by Astral, the creators of Ruff.

Install uv with our standalone installers:

# On macOS and Linux.
curl -LsSf https://astral.sh/uv/install.sh | sh
# On Windows.
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Or, from PyPI:

# With pip.
pip install uv
# Or pipx.
pipx install uv

If installed via the standalone installer, uv can update itself to the latest version:
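uv self update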

See the installation documentation for details and alternative installation methods.

uv's documentation is available at docs.astral.sh/uv.

Additionally, the command line reference documentation can be viewed with uv help.

uv manages project dependencies and environments, with support for lockfiles, workspaces, and more, similar to rye or poetry:

$ uv init example
Initialized project `example` at `/home/user/example`

$ cd example
$ uv add ruff
Creating virtual environment at: .venv
Resolved 2 packages in 170ms
 Built example @ file:///home/user/example
Prepared 2 packages in 627ms
Installed 2 packages in 1ms
 + example==0.1.0 (from file:///home/user/example)
 + ruff==0.5.0

$ uv run ruff check
All checks passed!

$ uv lock
Resolved 2 packages in 0.33ms

$ uv sync
Resolved 2 packages in 0.70ms
Audited 1 package in 0.02ms

See the project documentation to get started.

uv also supports building and publishing projects, even if they're not managed with uv. See the publish guide to learn more.

uv manages dependencies and environments for single-file scripts.

Create a new script and add inline metadata declaring its dependencies:

$ echo 'import requests; print(requests.get("https://astral.sh"))' > example.py

$ uv add --script example.py requests
Updated `example.py`
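After the update, the script carries a PEP 723 inline-metadata block that uv reads on each run. It looks roughly like this (the exact requires-python bound and version pin will vary):

# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "requests",
# ]
# ///
import requests
print(requests.get("https://astral.sh"))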

Then, run the script in an isolated virtual environment:

$ uv run example.py
Reading inline script metadata from: example.py
Installed 5 packages in 12ms
<Response [200]>

See the scripts documentation to get started.

uv executes and installs command-line tools provided by Python packages, similar to pipx.

Run a tool in an ephemeral environment using uvx (an alias for uv tool run):

$ uvx pycowsay 'hello world!'
Resolved 1 package in 167ms
Installed 1 package in 9ms
 + pycowsay==0.0.0.2
 """  ------------
< hello world! >
 ------------
 \ ^__^
 \ (oo)\_______
 (__)\ )\/\
 ||----w |
 || ||

Install a tool with uv tool install:

$ uv tool install ruff
Resolved 1 package in 6ms
Installed 1 package in 2ms
 + ruff==0.5.0
Installed 1 executable: ruff

$ ruff --version
ruff 0.5.0

See the tools documentation to get started.

uv installs Python and allows quickly switching between versions.

Install multiple Python versions:

$ uv python install 3.10 3.11 3.12
Searching for Python versions matching: Python 3.10
Searching for Python versions matching: Python 3.11
Searching for Python versions matching: Python 3.12
Installed 3 versions in 3.42s
 + cpython-3.10.14-macos-aarch64-none
 + cpython-3.11.9-macos-aarch64-none
 + cpython-3.12.4-macos-aarch64-none

Download Python versions as needed:

$ uv venv --python 3.12.0
Using Python 3.12.0
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate

$ uv run --python pypy@3.8 -- python --version
Python 3.8.16 (a9dbdca6fc3286b0addd2240f11d97d8e8de187a, Dec 29 2022, 11:45:30)
[PyPy 7.3.11 with GCC Apple LLVM 13.1.6 (clang-1316.0.21.2.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>>

Use a specific Python version in the current directory:

$ uv python pin 3.11
Pinned `.python-version` to `3.11`

See the Python installation documentation to get started.

uv provides a drop-in replacement for common pip, pip-tools, and virtualenv commands.

uv extends their interfaces with advanced features, such as dependency version overrides, platform-independent resolutions, reproducible resolutions, alternative resolution strategies, and more.

Migrate to uv without changing your existing workflows — and experience a 10-100x speedup — with the uv pip interface.

Compile requirements into a platform-independent requirements file:

$ uv pip compile docs/requirements.in \
 --universal \
 --output-file docs/requirements.txt
Resolved 43 packages in 12ms
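The input file itself is just a list of loose, top-level constraints. A hypothetical docs/requirements.in might contain:

# docs/requirements.in (hypothetical contents)
mkdocs>=1.6
mkdocs-material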

Create a virtual environment:

$ uv venv
Using Python 3.12.3
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate

Install the locked requirements:

$ uv pip sync docs/requirements.txt
Resolved 43 packages in 11ms
Installed 43 packages in 208ms
 + babel==2.15.0
 + black==24.4.2
 + certifi==2024.7.4
 ...

See the pip interface documentation to get started.

See uv's platform support document.

See uv's versioning policy document.

We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the contributing guide to get started.

uv is pronounced "you - vee" (/juː viː/).

It's stylized as just "uv", please. See the style guide for details.

uv's dependency resolver uses PubGrub under the hood. We're grateful to the PubGrub maintainers, especially Jacob Finkelman, for their support.

uv's Git implementation is based on Cargo.

Some of uv's optimizations are inspired by the great work we've seen in pnpm, Orogene, and Bun. We've also learned a lot from Nathaniel J. Smith's Posy and adapted its trampoline for Windows support.

uv is licensed under either of:

 • Apache License, Version 2.0
 • MIT License

at your option.

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in uv by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.



Comments

  • By acheong08 2025-06-2318:5611 reply

    Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool I thought.

    I now use uv for everything Python. The reason for the switch was a shared server where I did not have root and there were all sorts of broken packages/drivers and I needed pytorch. Nothing was working and pip was taking ages. Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked

    If you're still holding out, really just spend 5 minutes trying it out, you won't regret it.

    • By tetha 2025-06-2320:451 reply

      For me, the big key was: uv is so much easier to explain and especially use - especially for people who sometimes script something in python and don't do this daily.

      pip + config file + venv requires you to remember ~2 steps to get the right venv - create one and install stuff into it, and for each test run, script execution and such, you need to remember a weird shebang-format, or to activate the venv. And the error messages don't help. I don't think they could help, as this setup is not standardized or blessed. You just have to beat a connection of "Import Errors" to venvs into your brain.

      It's workable, but teaching this to people unfamiliar with it has reminded me how... squirrely the whole tooling can be, for lack of a better word.

      Now, team members need to remember "uv run", "uv add" and "uv sync". It makes the whole thing so much easier and less intimidating to them.
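      For comparison, the two day-one workflows look roughly like this (script name made up):

          # the classic dance: create the venv, activate it, install, then run
          python -m venv .venv
          source .venv/bin/activate
          pip install -r requirements.txt
          python scripts/report.py

          # uv: one command creates or reuses the project venv and runs inside it
          uv run scripts/report.py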

      • By robertlagrant 2025-06-2414:29

        This is similar to things like poetry (poetry run ..., poetry add, poetry install), but yeah. Does look nice.

    • By tomjakubowski 2025-06-2321:212 reply

      The absolute killer feature for me of uv is that it's still compatible with all of my old venv-based workflows. Just run `uv venv`.

      • By level09 2025-06-2413:261 reply

        Unfortunately uWSGI (one of the most important libraries) is fundamentally incompatible with uv. I had to roll back all my apps that use custom uWSGI for that reason.

        • By iFreilicht 2025-06-2416:21

          Could you explain how? Does it do something funky in the venv that uwsgi doesn't understand?

    • By psychoslave 2025-06-2320:004 reply

      I wonder how it compares with something more generalist like "mise", to which I migrated after using "asdf" for some time.

      • By codethief 2025-06-2321:44

        Similarly to the sibling I also use both. I let mise manage my uv version (and other tools) and let uv handle Python + PyPI Packages for me. Works great!

        There's also some additional integration which I haven't tried yet: https://mise.jdx.dev/mise-cookbook/python.html#mise-uv

      • By varikin 2025-06-241:45

        Think of uv more like npm or other tools of that sort. The new Python pyproject.toml is similar to package.json: it defines the project description, the list of dependencies, and other hooks. uv is a package/project tool built around pyproject.toml. It makes it easy to manage dependencies, build and publish to PyPI, and add hooks to run tests, linters, or whatever, again much like package.json. It also manages the virtualenv automatically, though you can manage it yourself.

      • By elbear 2025-06-245:32

        Thanks for mentioning mise. I'm more interested in it for the task running feature. I couldn't figure out how to have shell scripts in Justfile, so I gave up on it.

      • By wrboyce 2025-06-2320:061 reply

        I use both! uv installed globally with mise, and uv tools can then be managed via "mise use -g pipx:foo".

        • By icedchai 2025-06-2322:50

          Same! I recently set up some dev environments with mise. My old ones are still using poetry, the new ones have uv. uv is incredibly snappy. It's like night and day!

    • By yjftsjthsd-h 2025-06-2320:063 reply

      > Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked

      Is it better about storage use? (And if so, how? Is it just good at sharing what can be shared?)

      • By fsh 2025-06-2320:121 reply

        uv hardlinks identical packages, so adding virtual envs takes up very little space.

        • By snerbles 2025-06-2320:321 reply

          Unless you cross mount points, which uv will helpfully warn about.

          • By codethief 2025-06-2321:42

            In those situations, have you had any luck using UV_LINK_MODE=symlink? I eventually had to resort to `copy` mode because it seemed the folder names (hashes) uv created in the package cache were not fully deterministic. So sometimes my cache volume would change and my Docker build would break. :\
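            For reference, a sketch of the knobs involved, the same setting in two spellings:

                UV_LINK_MODE=symlink uv sync   # symlink out of the global cache
                uv sync --link-mode=copy       # safest when cache and venv sit on different filesystems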

      • By acheong08 2025-06-2321:581 reply

        Both pip and uv cache packages to ~/.cache. uv lets you change it to /tmp and symlink instead of copying.

        • By esseph 2025-06-246:10

          Note: /tmp will take a while to get rid of, but it's definitely on the chopping block.

          I'd avoid workflows that lean on it, if nothing else for security's sake.

      • By kissgyorgy 2025-06-2321:03

        There is a global cache for all installed packages in the user home cache dir.
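        Assuming a recent uv, you can inspect and prune it with:

            uv cache dir    # print where the global cache lives
            uv cache clean  # drop it entirely to reclaim the space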

    • By PeterStuer 2025-06-248:12

      Just made the switch myself. Was far easier and smoother than I expected.

    • By ed_elliott_asc 2025-06-2320:57

      I came here to comment that I don't see any reason to bother - thanks to this comment, I will try it now!

    • By bmitc 2025-06-2322:521 reply

      What has been holding me back on uv is my experience with Ruff. Ruff claims "Drop-in parity with Flake8, isort, and Black", but that is simply not true. At least for isort, Ruff only re-implemented what they wanted and then asks you to call out to the old isort tool if there's a feature or setting missing from the Ruff re-implementation. So what's the point? Ruff just partially re-implemented many existing tools and added some new ones. So using Ruff actually increases the number of tools you have to use, yet again, while also not doing everything that Pylint does.

      For moving to uv, I haven't heard a good story for what uv provides over Poetry rather than "is fast". The only unique thing that I am currently aware of is that uv can install Python itself, which gets rid of tools like Pyenv. I'm interested because of that, but "is fast" isn't enough of a reason.

      • By Hasnep 2025-06-2323:39

        My experience is that ruff reimplemented 99% of the most popular features of black, isort, flake8, pylint, etc., and then added 10000% more features on top; that feels like a fair tradeoff to me.

        I've converted multiple large Python codebases to ruff, and each time I just configure ruff as close to the previous tools as possible, then reformat the entire codebase with ruff and remove all the previous tools. The speed increase when linting alone is worth the minor formatting changes to me.

        If you really insist on keeping isort's sorting then you could at least replace black and pylint, which would reduce the total number of tools by one.

    • By mistrial9 2025-06-2322:23

      Similar story recently with an experimental repo that starts with "it's so easy, just `uv a b c`"... under the hood it implies a lot of redundancy, but true enough, it worked fine and trouble-free on a standard GNU/Debian/Ubuntu host.

    • By _vya7 2025-06-2320:513 reply

      I remember using pip and venv back in like 2009. Last time I checked, maybe 5 or 10 years ago, the recommendation of the community was generally to just use Docker instead of all these tools. Did that not catch on?

      • By unclad5968 2025-06-2321:081 reply

        The advice seems to change every year. For a while it was venv, then pipenv, poetry, docker, and now uv. Maybe the ecosystem will settle on that but who knows.

        • By AlphaSite 2025-06-240:33

          I mean, Docker is orthogonal to a package manager. It makes it easier to deploy, but none of the other things package managers do are relevant to it.

      • By lmm 2025-06-241:25

        Docker was always a workaround to Python not having a non-awful dependency manager. uv is that non-awful dependency manager, and I expect in the long term it will reduce the use of Docker.

      • By dagw 2025-06-2411:20

        Docker solves a different problem. Docker is a way to basically ship your whole OS off to another machine. You still have to have a way to install the right version of python and all the python libraries you need inside the Docker container, and uv is great for this.

        Secondly Docker only solves a subset of problems. It's fine if you're developing a server that you will be deploying somewhere. It's inconvenient if you're developing an end user application, and it's completely useless if you're developing a library you want people to be able to install.
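      A minimal sketch of that combination (the base image tag and module name are assumptions, not official guidance):

          # Dockerfile: uv-managed Python and locked dependencies inside the image
          FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim
          WORKDIR /app
          COPY pyproject.toml uv.lock ./
          RUN uv sync --frozen --no-install-project
          COPY . .
          RUN uv sync --frozen
          CMD ["uv", "run", "python", "-m", "app"]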

    • By oofbey 2025-06-2321:523 reply

      I love uv. The one gotcha I'll warn people about is: don't touch uvx. I've lost an embarrassing number of hours or days trying to figure out why nothing works properly or makes sense when I tried to run things with uvx. I guess I understand why it's there, but I think it's a built-in foot-gun and not well documented. But if you stay away from it, things work great.

      • By nirv 2025-06-2323:31

        I suppose you didn't accompany the command `uvx` with the necessary `--with=` arguments[1] for each dependency.

        [1] https://docs.astral.sh/uv/guides/tools/#commands-with-plugin...

      • By jsmeaton 2025-06-2322:53

        What issues are you having with uvx? It replaces tools like pipx that set up implicit venvs to run specific tools. Works great for me.

      • By maleldil 2025-06-2322:46

        uvx is fine. I use it to run executable packages all the time. What is your problem with it?

    • By espdev 2025-06-2319:274 reply

      > Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool I thought

      Really? :)

      requirements.txt is just hell and torture. If you've ever used modern project/dependency management tools like uv, Poetry, PDM, you'll never go back to pip+requirements.txt. It's crazy and a mess.

      uv is super fast and a great tool, but it still has rough edges and bugs.

      • By aequitas 2025-06-2319:48

        Pip-tools + requirements.txt helped me survive the past few years. I also never thought I needed uv, but after all the talk about it I gave it a spin and never went back. It's just so blazingly fast and convenient.

      • By kortex 2025-06-2320:471 reply

        We use uv to compile requirements.txt from pyproject.toml to get the locked versions.

            # Makefile
            compile-deps:
                uv pip compile pyproject.toml -o requirements.txt

            compile-deps-dev:
                uv pip compile --extra=dev pyproject.toml -o requirements.dev.txt

        • By espdev 2025-06-2322:332 reply

          What for? Support legacy CI/CD pipelines or something like that? uv.lock already contains locked versions of all dependencies plus a lot of other needed metadata.

          • By halfcat 2025-06-2323:32

            > What for? Support legacy CI/CD pipelines

            Yes. Azure, for instance, looks for requirements.txt if you deploy a web app to Azure App Service.

            If you’re doing a code-based deployment, it works really well. Push to GitHub, it deploys.

            You can of course do a container-based deployment to Azure App Service and I’d assume that will work with uv.

          • By esseph 2025-06-246:12

            "legacy CI/CD pipelines"

            Damn I'm getting old

      • By _Algernon_ 2025-06-2319:58

        pip also works with pyproject.toml. Sticking with requirements.txt is a self-imposed constraint.
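        For example, with standard pip and no uv involved:

            pip install .            # resolves dependencies from pyproject.toml
            pip install -e ".[dev]"  # editable install plus an optional-dependency group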

      • By pinoy420 2025-06-2319:31

        [dead]

  • By polivier 2025-06-2318:174 reply

    The first time I used `uv`, I was sure that I had made a mistake or typed something wrong because the process finished so much more quickly than anything I had ever experienced with `pip`.

    • By tux3 2025-06-2319:141 reply

      I've sometimes had uv take up to 200ms to install packages, so you could feel a slight delay between pressing enter and the next shell prompt

      You don't have that problem with Poetry. You go make a cup of coffee for a couple minutes, and it's usually done when you come back.

      • By Numerlor 2025-06-2320:091 reply

        It's funny when the exact same thing was probably said about pipenv and poetry

        • By icedchai 2025-06-2322:51

          I've had poetry sit there for minutes resolving dependencies at a previous company. I thought something was broken... it probably was... but it did eventually complete.

    • By baby 2025-06-2318:181 reply

      Same here lol! The experience is so smooth it doesn't feel like python

      • By johnfn 2025-06-2319:38

        That makes sense, because it's Rust. :)

    • By augustflanagan 2025-06-2318:37

      I just had this same experience last week, and was certain it wasn’t working correctly as well. I’m a convert.

    • By nialse 2025-06-2318:28

      Likewise. I was skeptical, then I tried it and won’t go back.

  • By theLiminator 2025-06-2317:437 reply

    uv and ruff are a great counterexample to all those people who say "never reinvent the wheel". Don't ever do it just for the sake of doing it, but if you have focused goals you can sometimes produce a product that's an order of magnitude better.

    • By CrendKing 2025-06-2321:311 reply

      I believe most of the time this phrase is said to an inexperienced artisan who has no idea how the current system works, what its shortcomings are, and how to improve upon it. Think of an undergraduate student who tries to solve the Goldbach conjecture. Usually what ends up happening is that he either fails to reinvent the wheel, or reinvents the exact same wheel, which has no value. The phrase certainly does not apply to professionals.

      • By dwattttt 2025-06-244:50

        Even then, you know what's a good way to learn how the current system works, maybe even the best way? I've got many failed projects behind me, and 0 regrets.

    • By eviks 2025-06-2317:582 reply

      They didn't reinvent the wheel, "just" replaced all the wood with more durable materials to make it handle rotation at 10 times the speed

      • By doug_durham 2025-06-2321:54

        A big part of the "magic" is that there is a team of paid professionals maintaining and improving it. That's more important than it being written in Rust. If uv were forked it would devolve to the level of pip over time.

      • By socalgal2 2025-06-2318:487 reply

        I'd be curious to know exactly what changed. Python -> Rust won't make network downloads faster nor file I/O faster. My naive guess is that all the speed comes from choosing better algorithms and/or parallelizing things. Not from Python vs Rust (though if it's hard to parallelize in Python and easy in rust that would certainly make a difference)

        • By ekidd 2025-06-2319:30

          I've translated code from Ruby to Rust, and other code from Python to Rust.

          Rust's speed advantages typically come from one of a few places:

          1. Fast start-up times, thanks to pre-compiled native binaries.

          2. Large amounts of CPU-level concurrency with many fewer bugs. I'm willing to do ridiculous threading tricks in Rust I wouldn't dare try in C++.

          3. Much lower levels of malloc/free in Rust compared to some high-level languages, especially if you're willing to work a little for it. Calling malloc in a multithreaded system is basically like watching the Millennium Falcon's hyperdrive fail. Also, Rust encourages abusing the stack to a ridiculous degree, which further reduces allocation. It's hard to "invisibly" call malloc in Rust, even compared to a language like C++.

          4. For better or worse, Rust exposes a lot of the machinery behind memory layout and passing references. This means there's a permanent "Rust tax" where you ask yourself "Do I pass this by value or reference? Who owns this, and who just borrows it?" But the payoff for that work is good memory locality.

          So if you put in a modest amount of effort, it's fairly easy to make Rust run surprisingly fast. It's not an absolute guarantee, and there are a couple of traps for the unwary (like accidentally forgetting to buffer I/O, or benchmarking debug binaries).

        • By the8472 2025-06-2319:05

          NVMe hungers, and keeping it fed is hard work. A serial read, decompress, checksum, write loop will leave it starved (QD<1) whenever you're doing anything but the last step. Disk I/O isn't async unless you use io_uring (well, OK, writeback caches can be). So threads are almost a must to keep NVMe busy. Conversely, waiting on blocking I/O (e.g. directory enumeration) will keep your CPU starved. Here too the answer is more threads.

        • By captnswing 2025-06-2320:471 reply

          Extremely interesting presentation from Charlie Marsh about all the optimizations https://youtu.be/gSKTfG1GXYQ?si=CTc2EwQptMmKxBwG

          • By socalgal2 2025-06-241:20

            Thanks. So from the video, the biggest wins were:

            1. The way they get the metadata for a package.

            Packages are zip files, and zip files have their TOC at the end. So instead of downloading the entire zip, they just fetch the end of the file, read the TOC, and from that download only the metadata part.

            I've written that code before for my own projects.

            2. They cache packages unzipped, then link them into your environment.

            This means no files are copied on the 2nd install. Just links.

            Both of those are huge time wins that would be possible in any language.

            3. They store their metadata as a memory dump

            So, on loading there is nothing to parse.

            Admittedly this is hard (impossible?) in many languages. Certainly not possible in Python and JavaScript. You could load binary data but it won't be useful without copying it into native numbers/strings/ints/floats/doubles etc...

            I've done this in game engines to reduce load times in C/C++ and to save memory.

            It'd be interesting to write some benchmarks for the first 2. The 3rd is a win but I suspect the first 2 are 95% of the speedup.
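            A minimal sketch of the first trick, assuming the server honors suffix Range requests and the TOC fits in the last 64 KiB:

                import struct, urllib.request

                EOCD_SIG = b"PK\x05\x06"  # zip "end of central directory" signature

                def remote_zip_entry_count(url: str, tail: int = 65536) -> int:
                    # Fetch only the end of the archive, where the TOC lives.
                    req = urllib.request.Request(url, headers={"Range": f"bytes=-{tail}"})
                    data = urllib.request.urlopen(req).read()
                    i = data.rfind(EOCD_SIG)
                    if i < 0:
                        raise ValueError("EOCD record not found in the fetched tail")
                    # Offset 10 of the EOCD record holds the total entry count.
                    return struct.unpack_from("<H", data, i + 10)[0]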

        • By jerpint 2025-06-2319:11

          From just my observations they basically parallelized the install sequence instead of having it be sequential (among many other optimizations most likely)
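          Purely to illustrate the fan-out, a stdlib-only sketch (URLs and pool size made up):

              from concurrent.futures import ThreadPoolExecutor
              import urllib.request

              def fetch(url: str) -> bytes:
                  with urllib.request.urlopen(url) as resp:
                      return resp.read()

              urls = [f"https://example.com/pkg-{i}.whl" for i in range(100)]
              with ThreadPoolExecutor(max_workers=32) as pool:
                  wheels = list(pool.map(fetch, urls))  # downloads overlap instead of queueing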

        • By jerf 2025-06-2321:052 reply

          It became a bit of a meme, especially in the web development space, that all programs are always waiting on external resources like networks, databases, disks, etc., and so scripting languages being slower than other languages doesn't matter and they'll always be as fast as non-scripting languages.

          Even on a single core, this turns out to be simply false. It isn't that hard to either A: be doing enough actual computation that faster languages are in fact perceptibly faster, even, yes, in a web page handler or other such supposedly-blocked computation or B: without realizing it, have stacked up so many expensive abstractions on top of each other in your scripting language that you're multiplying the off-the-top 40x-ish slower with another set of multiplicative penalties that can take you into effectively arbitrarily-slower computations.

          If you've never profiled a mature scripting language program, it's worth your time. Especially if nobody on your team has ever profiled it before. It can be an eye-opener.

          Then it turns out that for historical path reasons, dynamic scripting languages are also really bad at multithreading and using multiple cores, and if you can write a program that can leverage that you can just blow away the dynamic scripting languages. It's not even hard... it pretty much just happens.

          (I say historical path reasons because I don't think an inability to multithread is intrinsic to the dynamic scripting languages. It's just they all came out in an era when they could assume single core, it got ingrained into them for a couple of decades, and the reality is, it's never going to come fully out. I think someone could build a new dynamic language that threaded properly from the beginning, though.)

          You really can see big gains just taking a dynamic scripting language program and turning it into a compiled language with no major changes to the algorithms. The 40x-ish penalty off the top is often in practice an underestimate, because that number is generally from highly optimized benchmarks in which the dynamic language implementation is highly tuned to avoid expensive operations; real code that takes advantage of all the conveniences and indirection and such can have even larger gaps.

          This is not to say that dynamic scripting languages are bad. Performance is not the only thing that matters. They are quite obviously fast enough for a wide variety of tasks, by the strongest possible proof of that statement. That said, I think it is the case that there are a lot of programmers who have no idea how much performance they are losing in dynamic scripting languages, which can result in suboptimal engineering decisions. It is completely possible to replace a dynamic scripting language program with a compiled one and possibly see 100x+ performance improvements on very realistic code, before adding in multithreading. It is hard for that not to manifest in some sort of user experience improvement. My pitch here is not to give up dynamic scripting languages, but to have a more realistic view of the programming language landscape as a whole.

          • By RhysU 2025-06-2322:083 reply

            > Then it turns out that for historical path reasons, dynamic scripting languages are also really bad at multithreading and using multiple cores...

            What would a dynamic scripting language look like that wasn't subject to this limitation? Any examples? I don't know of contenders in this design space--- I am not up on it.

            • By jerf 2025-06-2413:40

              It would look pretty much the same. It would just have been written to be multithreaded from the beginning, and lack the long list of restrictions and caveats and "but it doesn't work with our C extensions" and such. There wouldn't be a dozen major libraries trying to solve the problem (which, contrary to many people's intuition, is often a sign that a language lacks a good solution). This is part of why I say there's no fundamental reason this can't be done, it's just a historical accident.

            • By Tuna-Fish 2025-06-2323:39

              The big difference from Python is probably having to use a real tracing GC instead of automatic reference counting. For a single-threaded program, refcounts are beneficial in multiple ways, being fairly cheap, having a smooth performance profile, maintaining low resident set size, and providing deterministic freeing.

              But because of the way cache coherency for shared, mutated memory works, parallel refcounting is slow as molasses and will always remain so.

              I think Ruby has always used a tracing GC, but it also still has a GIL for some reason?

            • By dgb23 2025-06-246:53

              There are dynamic languages that were built with concurrency in mind like Clojure. It’s also a surprisingly fast language considering it’s both dynamic and functional.

          • By socalgal2 2025-06-2321:241 reply

            I'm not trying to suggest that you can't do faster computation in a lower-level language. But a package manager doesn't do much computation. It mostly downloads, decompresses, and writes files. Yes, it has to solve constraints, but that's not a bottleneck given most projects have at most a few hundred dependencies, not millions.

            I don't know python but in JavaScript, triggering 1000 downloads in parallel is trivial. Decompressing them, like in python, is calling out to some native function. Decompressing them in parallel in JS would also be trivial (no idea about python). Writing them in parallel is also trivial.

            • By jerf 2025-06-2413:441 reply

              Congratulations! You have proved that it is impossible for uv to be way, way faster than Python-based package managers!

              ....

              Unfortunately, there seems to be a problem here.

              When reality and theory conflict, reality wins.

              It sounds like you've drunk the same Kool-Aid I was referring to in my post. It's not true. When you're playing with 50x-100x slowdowns, if not more, it's really quite easy to run into user-perceptible slowdowns. A lot of engineers grotesquely underestimate how slow these languages are. I suspect it may be getting worse over time due to evaporative cooling, as engineers who do understand it tend to have one reason or another to leave the language community at some point, and I believe (though I cannot prove) that as a result the dynamic scripting language communities are actually getting worse and worse at realizing how slow their languages are. They're really quite slow.

              • By socalgal2 2025-06-2417:42

                You seem to be implying rust = fast, the end. I'm implying algorithms and design choices = fast. Those decisions generally (though not always) are far more effective at speed than language choice.

                I watched the video linked above on uv. They went over the optimizations. The big wins had nothing to do with rust and everything to do with design/algo choices.

                You could have also done without the insults. You have no idea who I am and my experiences. I've shipped several AAA games written in C/C++ and assembly. I know how to optimize. I also know how dynamic languages work. I also know when people are making up bullshit about "it's fast because it's in rust!". No, that is not why it's fast.

        • By physicsguy 2025-06-2319:49

          The package resolution is a big part of it, it's effectively a constraint solver. I.e. if package A requires package B constrained between version 1.0 < X <= 2.X and Package B requires package C between... and so on and so on.

          Conda rewrote their package resolver for similar reasons
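          A toy backtracking search over a made-up index shows the shape of the problem (uv's actual solver, PubGrub, is far smarter about learning from conflicts):

              # Made-up index: package -> version -> {dependency: allowed versions}
              INDEX = {
                  "A": {2: {"B": {2}}, 1: {"B": {1, 2}}},
                  "B": {2: {"C": {1}}, 1: {}},
                  "C": {1: {}},
              }

              def resolve(reqs, pinned=None):
                  """Depth-first search for versions satisfying every constraint."""
                  pinned = dict(pinned or {})
                  if not reqs:
                      return pinned
                  (pkg, allowed), rest = reqs[0], reqs[1:]
                  if pkg in pinned:  # already chosen: just check compatibility
                      return resolve(rest, pinned) if pinned[pkg] in allowed else None
                  for version in sorted(allowed & set(INDEX[pkg]), reverse=True):
                      deps = list(INDEX[pkg][version].items())
                      found = resolve(rest + deps, {**pinned, pkg: version})
                      if found is not None:
                          return found
                  return None  # no version works: backtrack

              print(resolve([("A", {1, 2})]))  # -> {'A': 2, 'B': 2, 'C': 1}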

        • By globular-toast 2025-06-2320:15

          There is a talk about it from one of the authors here: https://www.youtube.com/watch?v=gSKTfG1GXYQ

          tl;dw Rust, a fast SAT solver, micro-optimisation of key components, caching, and hardlinks/CoW.

    • By 0cf8612b2e1e 2025-06-2318:095 reply

      The history of Python package management is clear that everyone thinks they can do a better job than the status quo.

      • By psunavy03 2025-06-2318:101 reply

        In this case, they were right.

        • By dwattttt 2025-06-245:34

          I would say in many cases they were right; the history of Python package management is littered with winners as well as losers.

      • By lmm 2025-06-241:291 reply

        Python package management was notoriously awful. The problem wasn't that people were trying to do things better, it was that they weren't; every new Python dependency management tool just repeated the mistakes of all the previous Python dependency management tools. uv is the first one to break the cycle (and it's probably not a coincidence that it's the first one to not be written in Python).

        • By nonethewiser 2025-06-2414:56

          Poetry broke the cycle. Unified toolchain, lock file, single configuration file, full dependency graph, dev dependencies. uv is faster which is great but Poetry was a huge step in the right direction and still a good tool.

      • By nickelpro 2025-06-2319:162 reply

        uv is purely a performance improvement, it changes nothing about the mechanics of Python environment management or packaging.

        The improvements came from lots of work from the entire python build system ecosystem and consensus building.

        • By 0cf8612b2e1e 2025-06-2319:361 reply

          Disagree in that uv makes switching out the underlying interpreter so straightforward. Becomes trivial to swap from say 3.11 to 3.12. The pybi idea.

          Sure, other tools could handle the situation, but being baked into the tooling makes it much easier to bootstrap different configurations.

          • By nickelpro 2025-06-2319:371 reply

            Yes, it's faster and better than pyenv, but the mechanism it's using (virtual environments) is not a uv invention.

            uv does the Python ecosystem better than any other tool, but it's still the standard Python ecosystem as defined in the relevant PEPs.

            • By pityJuke 2025-06-2320:261 reply

              Are the lock files standardised, or a uv-specific thing?

              • By nickelpro 2025-06-2321:06

                uv has both a uv-specific implementation, and support for standard PEP 751 lockfiles

        • By globular-toast 2025-06-2320:201 reply

          Actually not true. One of the main differences with uv is you don't have to think about venvs any more. There's a talk about it from one of the authors at a recent PyCon here: https://www.youtube.com/watch?v=CV8KRvWKYDw (not the same talk I linked elsewhere in the thread).

          • By nickelpro 2025-06-2321:103 reply

            How do you think uv works?

            It creates a venv. Note we're talking about the concept of a virtual environment here, PEP 405, not the Python module "venv".

            • By blitzar 2025-06-247:53

              > How do you think uv works?

              Dont know, dont care. It thinks about these things not me.

            • By lmm 2025-06-241:271 reply

              The implementation details don't matter. uv might follow PEP 405 but it could work just as well without doing so. The point is that it doesn't give you the bunch of extra footguns that any other Python package management does.

              • By nickelpro 2025-06-242:341 reply

                It matters immensely that it follows PEP 405; it makes uv the implementation detail. You can swap out uv for any other project management tool or build frontend, and nothing needs to change about the development environment.

                This is the entire purpose of the standards.

                • By lmm 2025-06-243:163 reply

                  > You can swap out uv for any other project management tool or build frontend, and nothing needs to change about the development environment.

                  > This is the entire purpose of the standards.

                  That seems to amount to saying that the purpose of the standards is to prevent progress and ensure that the mistakes of early Python project management tools are preserved forever. (Which would explain some things about the last ~25 years of Python project management I guess). The parts of uv that follow standards aren't the parts that people are excited about.

                  • By nickelpro 2025-06-2418:14

                    There are no parts of uv that don't follow standards.

                    The standards have nothing to do with the last 25 years of Python project management; the most important ones (PEP 517/518) are less than 10 years old.

                  • By dagw 2025-06-2410:08

                    > The parts of uv that follow standards aren't the parts that people are excited about.

                    I disagree. Had uv not followed these standards and instead gone off and done its own thing entirely, it could not function as a drop-in replacement for pip and venv, and it wouldn't have gotten anywhere near as much traction. I can use uv personally to work on projects that officially have to support pip and venv and have it all be transparent.

                  • By aragilar 2025-06-2411:43

                    uv only exists because of those standards and therefore can make assumptions that earlier tools could not.

            • By globular-toast 2025-06-246:42

              I said you don't have to think about venvs any more. It's great that we have a standard way to implement them, but this is only necessary in the first place because of the way Python is. Now we have a tool that enforces a workflow that creates virtualenvs without you having to know about them and therefore not screwing them up with ad hoc pip installs etc.

      • By akoboldfrying 2025-06-240:17

        True, but then all software is developed for this reason.

      • By henry700 2025-06-2318:57

        Of course they do; this tends to happen when the status quo is hot flaming garbage.

    • By mort96 2025-06-2318:275 reply

      Honestly "don't reinvent the wheel" makes absolutely no sense as a saying. We're not still all using wooden discs as wheels, we have invented much better wheels since the neolithic. Why shouldn't we do the same with software?

      • By simonw 2025-06-2319:041 reply

        When asked why he had invented JSON when XML already existed, Douglas Crockford said:

        The good thing about reinventing the wheel is that you can get a round one.

        https://scripting.wordpress.com/2006/12/20/scripting-news-fo...

        • By idle_zealot 2025-06-2320:372 reply

          You can get a round one. Or you can make yet another wonky shaped one to add to the collection, as ended up being the case with JSON.

          • By simonw 2025-06-2321:122 reply

            What makes JSON wonky?

            Personally the only thing I miss from it is support for binary data - you end up having to base64 binary content which is a little messy.

            • By idle_zealot 2025-06-2322:07

              Quoted keys, strict comma rules, very limited data types, are the main ones. There are a host of others if you view it through the lenses of user-read/write, and a different set of issues if you view it as a machine data interface. Trying to combine the two seems fundamentally misguided.

            • By Myrmornis 2025-06-242:10

              Lack of comments seems like a big one seeing as it's so widely used for "configuration". It's a big enough downside that VSCode and others have violated it via ad-hoc extensions of the format.

              The comma rules introduce diff noise on unrelated lines.

          • By psunavy03 2025-06-2322:121 reply

            Insert the xkcd about 15 competing standards . . .

            • By oblio 2025-06-2322:50

              Standards do die off, up to a point. XML is widely used but the last time I really had to edit it in anger working in DevOps/web/Python was a long time ago (10 years ago?).

              At this point XML is the backbone of many important technologies that many people won't use or won't use directly anymore.

              This wasn't the case circa 2010, when I doubt any dev could have really avoided XML for a bunch of years.

              I do like XML, though.

      • By haiku2077 2025-06-2318:521 reply

        Right, wheels are reinvented every few years. Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye.

        My primary vehicle has off-road capable tires that offer as much grip as a road-only tire would have 20-25 years ago, thanks to technology allowing Michelin to reinvent what a dual-purpose tire can be!

        • By nightpool 2025-06-2320:161 reply

          > Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye

          Can you share more about this? What has changed between tires of 2005 and 2025?

      • By sashimi-houdini 2025-06-246:49

        I also like Dan Luu's take (starting with a Joel Spolsky quote)

        “Find the dependencies — and eliminate them.” When you're working on a really, really good team with great programmers, everybody else's code, frankly, is bug-infested garbage, and nobody else knows how to ship on time.

        We had a similar attitude, although I'd say that we were a bit more humble. We didn't think that everyone else was producing garbage but, we also didn't assume that we couldn't produce something comparable to what we could buy for a tenth of the cost. From talking to folks at some competitors, there was a pretty big cultural difference between how we operated and how they operated. It simply didn't occur to them that they didn't have to buy into the standard American business logic that you should focus on your core competencies, that you can think through whether or not it makes sense to do something in-house on the merits of the particular thing instead of outsourcing your thinking to a pithy saying.[0]

        [0] https://danluu.com/nothing-works/

      • By aalimov_ 2025-06-2318:511 reply

        I always took this saying as meaning that we don’t re-invent the concept of the wheel. For example the Boring company and Tesla hoping to reinvent the concept of the bus/train.. (iirc your car goes underground on some tracks and you get to bypass traffic and not worry about steering)

        A metal wheel is still just a wheel. A faster package manager is still just a package manager.

        • By haiku2077 2025-06-2318:53

          That's not how I've ever seen it used in practice. People use it to mean "don't build a replacement for anything functional."

      • By rocqua 2025-06-240:12

        I came here to (wrongly) say that wooden disks were never used as wheels, and that ot all started with spokes. Some checking showed that, in fact, the oldest known wheels have a lot of solid disks. E.g: https://en.m.wikipedia.org/wiki/Ljubljana_Marshes_Wheel

        Hopefully this can disabuse others of similar mistaken memory.

    • By jjtheblunt 2025-06-2318:259 reply

      > an order of magnitude better

      off topic, but i wonder why that phrase gets used rather than 10x which is much shorter.

      • By BeetleB 2025-06-2319:44

        Short answer: Because the base may not be 10.

        Long answer: Because if you put a number, people expect it to be accurate. If it was 6x faster, and you said 10x, people may call you out on it.

      • By screye 2025-06-2319:00

        It's meant to signify a step change. Order of magnitude change = no amount of incremental changes would make up for it.

        In common conversation, the multiplier can vary from 2x to 10x. In the context of some algorithms, the order of magnitude can be over the delta rather than the absolute: e.g. an algorithm sees a 1.1x improvement over the previous 10 years, so a change that shows a 1.1x improvement by itself overshadows an order of magnitude more effort.

        For salaries, I've used order-of-magnitude to mean 2x. Good way to show a step change in a person's perceived value in the market.

      • By bxparks 2025-06-2318:421 reply

        I think of "an order of magnitude" as a log scale. It means somewhere between 3.16X and 31.6X.

        • By jjtheblunt 2025-06-2319:221 reply

          yeah that's what i meant with 10x, like it's +1 on the exponent, if base is 10. but i'm guessing what others are thinking, hence the question.

          • By bxparks 2025-06-243:011 reply

            The problem is that 10x appears to be a linear scale. It could mean 9.5x to 10.5x if it's supposed to have 2 significant digits. Or it could be 5x to 15x if it's meant to have 1 significant digit.

      • By fkyoureadthedoc 2025-06-2318:29

        - sounds cooler

        - 10x is a meme

        - what if it's 12x better

      • By Scene_Cast2 2025-06-2318:28

        10x is too precise.

      • By bmacho 2025-06-2319:53

        Because it's not 10x?

      • By chuckadams 2025-06-2319:11

        Because "magnitude" has cool gravitas, something in how it's pronounced. And it's not meant to be precise, it just means "a whole lot more".

      • By refulgentis 2025-06-2318:411 reply

        "10x" has been cheapened / heard enough / de facto, is a more general statement than a literal interpretation would indicate. (i.e. 10x engineer. Don't hear that much around these parts these days)

        Order of magnitude faces less of that baggage, until it does :)

        • By psunavy03 2025-06-2322:12

          Would you say it faces . . . orders of magnitude less baggage?

      • By neutronicus 2025-06-2319:11

        5x faster is an order of magnitude bc of rounding

    • By bmitc 2025-06-2322:55

      Ruff is actually a good example of the danger of rewrites. They rewrote tools but not all of the parts of the tools.

    • By zzzeek 2025-06-242:00

    ruff does not support custom plugins, so it's useless to me
