Uv: Python packaging in Rust (astral.sh)
647 points by samwho 86 days ago | 210 comments



A VC-backed pip-and-more doesn't make sense to me. It's 2024: what's the revenue model when the free money printer's on the fritz?


That was one of my first questions, but Anaconda exists https://www.anaconda.com/download/

Python is used by loads of scientists, academics, and other non-software engineers. Those people need an easy way to use Python.


I knew they sold something, but I am amazed to learn they have 300 employees.


Anaconda is very big in corporate envs.

If you want a platform that lets you manage Python on all machines, including allowed packages and versions, integrated with LDAP and with auditing capabilities, they are pretty much the only game in town.

And big companies want that.


Even bigger now that their distribution is integrated in Excel[1]

[1]: https://www.anaconda.com/blog/announcing-python-in-excel-nex...


I agree. Real question: Why didn't ActiveState (ActivePython) win this battle? In my corporate experience, they are invisible.


Because Continuum had a loss leader with Anaconda that, in 2010, was solving a ton of packaging problems.

Today I would say Anaconda brings more problems than it solves (https://www.bitecode.dev/p/why-not-tell-people-to-simply-use), but at the time, we didn't have wheels for everything, and it was a lifesaver for anybody who wanted to use C extensions.

So Anaconda first became popular because it solved a real end-user problem, then it moved on to being the corporate provider because it was already well known.

It was a very good strategy.


The issue is there's C extensions, and there's C extensions. Something like cryptography is self-contained (kinda; we're sweeping a lot under the carpet with Rust here), whereas something like PyTorch cares much more about what environment it was built in and will run in (and then there's things like mpi4py, which you can't really distribute wheels for on PyPI). conda, by basically distributing a full userland (which has its own issues), can handle all those packages, and even more hairy ones (e.g. how would you manage the R packages you use via rpy2), and because it runs on Windows (similar solutions before and after conda were tied to, or at least started with, Unix-based systems), it replaced the less general ones on Windows, which usually only supported a limited number of packages (e.g. ActivePython, Enthought).

PyPI-distributed wheels (you can obviously build wheels however you like, but that doesn't mean they'll run on others' systems) may at some point get close to what conda does (likely by reinventing it in an ad hoc fashion), but conda has enough mindshare (especially around its target of "data science", where all the tutorials use conda) that I don't see it disappearing.


They were quite big in the 2000s; in my bubble they faded away as .NET and Java with their IDEs came into the picture, alongside Tcl, Perl and Python improving their Windows support story.

They were never big outside Windows shops.


Why do they need so many employees?


Sell build-related services to companies. Imagine GitHub Actions, but it's cleanly built into your Python tooling in some reasonable way, so it's just the natural thing to use. I think it's pretty straightforward, although we'll see whether it works out for them.


NPM comes to mind. I’m imagining private package management and team support https://www.npmjs.com/products


npm, Inc. was an unsuccessful business that got acquired for a pittance because MS can afford to throw away a few million dollars in case it turns out that having control over npm is useful. The team working on it isn't very large but I'd still be surprised if it's actually operating at a profit.


NPM did not go well, and selling to Microsoft happened after the team there fell apart. In my view some of that is leadership issues, and some of that is pressure from a struggling business.


NPM’s got Microsoft/GitHub behind it. I doubt those features bring in any serious money, given the abundance of free alternatives.


I'm curious to see what the Pydantic startup will do.


I read their about page and it seems they want to make Python dev more productive. Maybe they have a lot of projects using it and are tired of the tooling/packaging BS. I could definitely see someone making billions allocate under 1% of it towards fixing that.

Improving Python is especially cheap compared to the productivity that could be unleashed. Surprised it isn't done more often. Only Microsoft has shown significant interest, which is a shame. Perhaps that's changing.


Congrats!

> Similarly, uv does not yet generate a platform-agnostic lockfile. This matches pip-tools, but differs from Poetry and PDM, making uv a better fit for projects built around the pip and pip-tools workflows.

Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".

Without being platform agnostic, to me this is dead-on-arrival and unable to meet the "Cargo for Python" aim.

> uv supports alternate resolution strategies. By default, uv follows the standard Python dependency resolution strategy of preferring the latest compatible version of each package. But by passing --resolution=lowest, library authors can test their packages against the lowest-compatible version of their dependencies. (This is similar to Go's Minimal version selection.)

> uv allows for resolutions against arbitrary target Python versions. While pip and pip-tools always resolve against the currently-installed Python version (generating, e.g., a Python 3.12-compatible resolution when running under Python 3.12), uv accepts a --python-version parameter, enabling you to generate, e.g., Python 3.7-compatible resolutions even when running under newer versions.

This is great to see though!

I can understand it being a flag on these lower level, directly invoked dependency resolution operations.
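For concreteness, here's a sketch of how those two flags might combine on the command line (the input file name is a placeholder):

    uv pip compile requirements.in --resolution=lowest --python-version=3.7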

While you aren't onto the higher level operations yet, I think it'd be useful to see if there is any cross-ecosystem learning we can do for my MSRV RFC: https://github.com/rust-lang/rfcs/pull/3537

How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today, but we're slowly improving.


Thanks Ed! Your work as always is a big source of inspiration.

> Do you expect to make the higher level workflow independent of requirements.txt / support a platform-agnostic lockfile? Being attached to Rye makes me think "no".

Yes, we absolutely do. We don't do this today, the initial scope is intentionally limited. But in the next phase of the project, we want to extend to multi-platform and multi-version resolution.

> How are you handling pre-releases in your resolution? Unsure how much of that is specified in PEPs. It's something that Cargo is weak in today, but we're slowly improving.

This is something we talked with Jacob about quite a bit. Turns out (as you know) it's a very hard problem. For the initial release, we added a constraint: our default behavior is that if you want to use a pre-release, you _have_ to specify the package as a first-party dependency and use a pre-release marker in the version specifier. (We also support globally enabling and disabling pre-releases.) So, we basically don't support "transitive" pre-releases right now -- but we give you a dedicated error message if your resolution fails for that reason.
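To make that concrete, a minimal sketch of what opting in per package looks like in a requirements file (package name and version are made up):

    # requirements.in
    somepackage>=2.0.0rc1  # explicit pre-release specifier opts this package into pre-releases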


Any plans to tackle the Python version installation side of things and make it as seamless as rustup has? I've previously used `pyenv install` for this, but it would be nice to fold it into one tool.


Note that Rye already handles it better than pyenv (it downloads pre-built Pythons instead of building from source). I assume they'll eventually copy/move the functionality over.


Yeah, this is very much on our roadmap and probably one of the first things we'll tackle next.


> How are you handling pre-releases in you resolution? Unsure how much of that is specified in PEPs.

The living version of PEP 440 has a bit on how pre-releases are handled [1]. The basic version is that the installer shouldn't select them at all, unless the user explicitly indicates that they want a pre-release. Once opted into, they're ordered by their phase (alpha, beta, rc) and pre-release increment (e.g. `beta.1` > `alpha.2`).

[1]: https://packaging.python.org/en/latest/specifications/versio...
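A quick way to see that ordering in practice, using the `packaging` library that implements the spec (a small sketch):

    >>> from packaging.version import Version
    >>> Version("1.0a2") < Version("1.0b1") < Version("1.0rc1") < Version("1.0")
    True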


PyTorch doesn't work well with platform-agnostic lockfiles. It's a constant source of issues for me when using Poetry.


The vast majority of pypi packages are not PyTorch, however.


That is great. Fuzzing would be cool too - just completely randomise the versions within the claimed compatibility constraints.


How does a language ecosystem that bakes "there should be one-- and preferably only one --obvious way to do it" into the interpreter* as a statement of values end up with such a convoluted packaging story?

* Try running `python -c 'import this' | sed -n 15p`


Simple: Python is old. It predates most shiny languages by decades, and so does its packaging journey. Most modern packaging tools can shine today because they could learn from the imperfections of their forerunners. Additionally, Python has a very wide field to cover, far wider than most other languages, which makes things more complicated and thus people are more open to experiments, which leads to so many different solutions.


Python is old, but pip itself had its 1.0 release 8 months after Ruby's Bundler did and 6 years after Apache Maven, both of which are substantially better package managers than pip and both of which are also for 90s-era programming languages, not the new shiny ones.

Age is part of the problem, but it's not the whole problem. Ruby manages to have a package manager that everyone loves, and Java has settled on a duopoly. Meanwhile Python has someone making a serious attempt at a new package manager every few years.


pip wasn't Python's first package manager, and obviously it wasn't just created at version 1.0. There were already years of history at that point. But yes, Ruby has a better history with packaging, similar to Perl. Python was kinda always copying Perl and then Ruby in what they did, while still aiming to preserve its own identity and requirements. Which is also why nobody wanted to copy the cargo-cult pain of the Java enterprise world.

> Meanwhile Python has someone making a serious attempt at a new package manager every few years.

Many people make many attempts at many things in many communities. That doesn't mean they are important. Heck, vim and zsh also have dozens of different package managers, and they have fewer use cases and problems to solve than Python.


> pip wasn't Python's first package manager, and obviously it wasn't just created at version 1.0. There were already years of history at that point.

All the more reason it should have been so much better, given it had already had time to learn from & iterate on past efforts.

Don't get me wrong, there are plenty of examples of projects ignoring well-known best practices (see the community backlash at Cargo's approach to namespacing, or Homebrew loudly ignoring community feedback on FS permissions for 10+ years), but at this stage in 2024 you'd think we'd have gotten a little further than we have.


Because packaging is a very complex problem, and it's rare that any one packaging solution can get everything right in the first try.

You will notice that every package management solution from all your favourite languages will have its own set of tradeoffs.


Packaging is hard but it's not hard enough to wholly explain pip.

Lock files alone are a proven piece of technology that pretty much just works across all modern package managers, and yet the best that pip can recommend to date is `pip freeze > requirements.txt`, which is strictly inferior to what is available in other package managers because there's no hash verification and no distinction between transitive and explicit dependencies.

That pip still doesn't have an answer for lock files is a sign of a deep problem in the Python package management world, not the natural result of packaging being hard.


My understanding is that pip developers are unpaid volunteers with other duties. That's what they frequently say at least. PSF rakes in a decent amount of coin every year but prefers to spend it on "outreach" instead of gaping wounds like this.

In short, python-dev leadership just doesn't care. Can't fix that with technical solutions.


Lock files don't work perfectly for me because they don't lock the Python version. Using a lock file with a different Python version, or even the same version but a different OS, may fail.


That’s literally his point. Other languages have solved this. It makes no sense that it’s not solved in python. (If your not sure how look at a golang go.mod file or a rust Cargo.toml file.)


There's no reason why the Python version couldn't be baked into the lock file or package manifest if pip had one. package.json does this [0]. Since all pip has is requirements.txt, it's a bit of a moot point.

[0] https://docs.npmjs.com/cli/v10/configuring-npm/package-json#...
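For reference, the package.json mechanism referred to in [0] looks roughly like this (field values are illustrative):

    {
      "name": "example-app",
      "engines": {
        "node": ">=18.0.0"
      }
    }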


Sounds like you want a nix flake.


pip has had constraints.txt forever; it is equivalent to a lock file. You are not supposed to freeze into requirements.txt.

Hopefully uv can make this the default behavior. It seems like the majority of users are not aware of its existence because it’s an optional flag.
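For anyone unfamiliar, the mechanism looks roughly like this (a sketch; the file names are just the conventional ones):

    pip install -r requirements.txt -c constraints.txt

where requirements.txt lists what you want installed and constraints.txt pins versions without triggering any installs itself.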


> You are not supposed to freeze into requirements.txt

ironic that pip freeze literally generates a requirements.txt with ==

constraints.txt certainly did not exist back when I was doing python.

Conversely, Ruby packaging has been a solved problem for a decade, while the Python community was extremely resistant to conceptually similar solutions for the longest time on strange ideological grounds, and came around only recently.


Somehow I've managed to go all this time without ever having heard of this feature. If this is the blessed path, can you explain why the pip docs recommend freezing to requirements.txt [0]? And why does the documentation for Constraints Files in the next section talk about them as though they're for something completely different?

Here's what they say about requirements:

> Requirements files are used to hold the result from pip freeze for the purpose of achieving Repeatable Installs. In this case, your requirement file contains a pinned version of everything that was installed when pip freeze was run.

Here's what they say about constraints:

> Constraints files are requirements files that only control which version of a requirement is installed, not whether it is installed or not. ... In terms of semantics, there is one key difference: Including a package in a constraints file does not trigger installation of the package.

> ... Write a single constraints file for your organisation and use that everywhere. If the thing being installed requires “helloworld” to be installed, your fixed version specified in your constraints file will be used.

> Constraints file support was added in pip 7.1. In Changes to the pip dependency resolver in 20.3 (2020) we did a fairly comprehensive overhaul, removing several undocumented and unsupported quirks from the previous implementation, and stripped constraints files down to being purely a way to specify global (version) limits for packages.

This sounds like something vaguely similar to a lock file, but they seem to intend it to be used globally at the organization level, and they're certainly not pushing it as the answer to locking dependencies for a specific project (they specifically recommend requirements for that). Maybe you can use it that way—although this Stack Overflow answer says that the 2020 update broke it for this use case [1]—but even so it doesn't answer the fundamental need for lock files unless everyone is actually on board with using it for that, and even the pip maintainers don't seem to think you should.

[0] https://pip.pypa.io/en/latest/user_guide/#requirements-files

[1] https://stackoverflow.com/questions/34645821/pip-constraints...


This is absolutely true, but I haven't seen any language ecosystems that have gotten things wrong as often as Python.

And quite a few where the tradeoffs are minor enough to be well worth some sanity & consistency.


C++. I think Java is pretty close in nastiness but I haven't used it in a while.


Java is Maven or Gradle mostly. Neither is perfectly ideal, but there are clear choices by subcommunities (e.g. Android => Gradle) & churn is low. Not at all an awful situation.

C++ is mostly unmanaged in practice, which is obviously bad (though Python is the only "modern" language where I regularly see published projects without so much as a requirements.txt, so very comparable). However, looking at the actual options for C++, Conan seems pretty clear-cut & not awful in itself.

So... not as bad as Python by any measure imo.


I think Maven and PyPI + pip are similar levels of pain.

Plenty of people just use pip or Poetry and are fine. It's totally bad faith to claim that this is all orders of magnitude worse.


The package manifest for maven is pom.xml. Sure it's got quirks - I might have the wrong version of a maven plugin it's got required directives for or something - but it's always there, it's declarative, and dependencies are listed in a consistent, predictable, universally parsable format (I don't even need maven, it's XML). It's also highly configurable via .mvn at project level so if you check that in chances are you can work around many weirdnesses.

And it's always there. If your IDE didn't generate it your maven archetype did.

What's the source of truth in any random pip project that I should look to as a package manifest? requirements.txt? setup.cfg? pyproject.toml? setup.py? What if they're all there? Do they all agree? Should I cascade or defer to one? Merge the trees? What if none of the above exists? pip certainly doesn't ensure any does.

As for parsing them, requirements.txt is the most barebones near-informationless manifest out there, and setup.py is init code - good luck trying to automate reading the dep tree reliably without side effects.


It's because no-one is in charge anymore, and Guido never cared about packaging anyway.


If you think Guido never cared about packaging, try calling it "The Cheese Shop" in distance of his hearing. :-)


Wasn’t any better back then. Not much of a cheese shop, is it?


You haven't used Swift before I see


The recent CocoaPods-to-SPM migration is almost as painful as py2->3, but it seems to be a clearly understood direction with an end in sight.

SPM's manifests being non-declarative is also super problematic for 3P tooling, but again - it's one well-defined target. Decisions are clear, if a little cumbersome. There's nowhere near the mess of indecision and duplication you see in Python projects.


Python refuses to have packages (folders don't count) or accept build steps. Everything follows from there, making it much harder to have a good solution.


Can't even rely on a package to have statically defined dependencies. You can just make setup.py generate a random selection of packages!
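A deliberately absurd sketch of what that means (the package and dependency names are made up); nothing below can be determined without executing the script:

    # setup.py
    import random
    from setuptools import setup

    setup(
        name="chaos",
        version="0.1.0",
        # install_requires is computed at run time, so a resolver must execute
        # this script just to learn the dependencies -- and may get a different
        # answer each time.
        install_requires=["requests"] if random.random() < 0.5 else ["httpx"],
    )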


Well, there are languages better at incorporating the Zen of Python than Python itself, vide (full disclosure: my blog post) https://p.migdal.pl/blog/2020/03/types-test-typescript/.

When it comes to package managers, as you noted, the situation is even more ironic, as nicely depicted in https://xkcd.com/1987/.


One way to do things becomes impossible after thirty years unless one wants to make large, breaking changes. After Python 3:

    >>> from __future__ import braces
      File "<stdin>", line 1
    SyntaxError: not a chance


That's been around since at least Python 2.6, and I think I remember seeing it before then:

    $ LC_ALL=C NIX_PATH=nixpkgs=https://github.com/NixOS/nixpkgs/archive/release-13.10.tar.gz nix-shell -p python26 --run 'python -c "from __future__ import braces"'
      File "<string>", line 1
    SyntaxError: not a chance


“After Python 3” doesn’t mean it arrived then. It was supposed to be read, “after Python three … not a chance.”


Haha, didn't know about this easter egg, amazing


What's your view on how Golang deals with this problem? Serious question.


Golang users don’t sit around arguing about package managers, nor do they have trouble with trying to make sure they have the right version of go installed.

Early versions of go didn’t solve things well, but go has had this fixed for a long time now. Any arguments about packaging issues are likely related to older out of date problems people had.

The only real problem people have these days is around private, non-open-source packages. You can do it, but you need to learn how to set environment variables.
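For what it's worth, the environment variables in question are things like GOPRIVATE (a real variable; the host pattern below is just a placeholder):

    export GOPRIVATE=git.internal.example.com/*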


Any plans to adopt Rye's approach of using standard Python builds to cover the installing of different Python versions?

I feel uv should provide a way to install different Python versions to truly be an end-to-end tool. The current approach of searching for existing virtualenvs or conda envs helps, but I was hoping to completely remove the need for another package/dependency manager.

(Taken from the docs)

  If a --python-version is provided to pip compile (e.g., --python-version=3.7), uv will search for a Python interpreter matching that version in the following order:

  - An activated virtual environment based on the VIRTUAL_ENV environment variable.

  - An activated Conda environment based on the CONDA_PREFIX environment variable.

  - A virtual environment at .venv in the current directory, or in the nearest parent directory.

  - The Python interpreter available as, e.g., python3.7 on macOS and Linux. On Windows, uv will use the same mechanism as py --list-paths to discover all available Python interpreters, and will select the first interpreter matching the requested version.

  - The Python interpreter available as python3 on macOS and Linux, or python.exe on Windows.


Feel like I called this 11 days ago - https://news.ycombinator.com/item?id=39251014

> If I had to guess, that's the path that the Astral team would take as well - expand ruff's capabilities so it can do everything a Python developer needs. So the vision that Armin is describing here might be achieved by ruff eventually. They'd have the advantage that they're not a single-person maintenance team, but the disadvantage of needing to show a return to their investors.


Looks awesome. I find that pip is usually pretty fast for me, and when it's not, it is mostly because it has to download so much data or wait for native libraries to compile in the background (or anything involving cuda which always seems to take forever). What really needs some help with speed is conda, which is just so absurdly slow for literally anything, even on ridiculously powerful machines.


For conda there is mamba [0], a drop-in replacement that's really fast.

By the way, the creator of mamba started his own company at https://prefix.dev/

They want to essentially leverage the conda(-forge) infrastructure to build a new cross-platform, cross-language, cargo-like package manager: pixi

[0] https://github.com/mamba-org/mamba


Since last year, conda itself is actually using the mamba solver:

https://conda.org/blog/2023-11-06-conda-23-10-0-release/


What are some of the reasons that teams use conda (and related tools) today? As a machine learning scientist, I used conda exclusively in the mid-2010s because it was the only framework that could reliably manage Python libraries like NumPy, PyTorch, and so on, that have complex binary dependencies. Today, though, pip install works fine for those packages. What am I missing?


For me personally, I prefer conda because it gives me dependency resolution (mamba), virtual environments, and a package repository (conda-forge), all from one base miniconda installation. And for all of my use cases, all of those just work. Dependency solving used to be painfully slow; mamba solved that. Packages used to be way behind the latest; setting conda-forge as my default solved that.

After fiddling with different solutions for years and having to start fresh with a new Python install, I've been using nothing but miniconda for years and it just works.


Unfortunately, far too often: tradition.

Using only "Python's native tools" like pip and venv simply works so well nowadays that I wonder about the purpose of many tools like poetry, etc.


Has anyone else been paying attention to how hilariously hard it is to package PyTorch in poetry?

https://github.com/python-poetry/poetry/issues/6409


For me it's the easiest and fastest cross-platform way to consistently install a Python version.

pip and venv work fine, but you have to get them first; and that can be a struggle for unseasoned python devs, especially if you need a version that's not what your distro ships, and even more so on Windows and macOS.

I use micromamba [1] specifically, which is a single binary.

[1] https://mamba.readthedocs.io/en/latest/user_guide/micromamba...


Maybe it's because I came into Python later, but I've almost never had the problem of pip not being installed. That's what ensurepip is for, right?


Another reason I used to use conda was easy native Windows installation. GPU-accelerated packages like OpenCV were especially difficult when I used it 6 years ago. Now there's the Windows Subsystem for Linux... has pip support dramatically improved on Windows?


The biggest advantage of poetry I found, working with a lot of non-traditional software people, is that it does a lot of things by default, like pinning versions and managing virtual envs. Unfortunately, it does complicate some things.


I can understand that well. A few articles from bitecode.dev helped me to "follow my intuition" and do as much as possible with native Python tools.

https://www.bitecode.dev/p/back-to-basics-with-pip-and-venv

https://www.bitecode.dev/p/relieving-your-python-packaging-p...


Those are interesting pointers; appreciate it! My own experience over the past three years has been similar. I tried using Pipenv, and then Poetry, for internal projects at my company; in both cases the tool seemed overly complicated for the problem, slow, and I had a hard time getting co-workers on board. About a year and a half ago, I saw "Boring Python: dependency management" (https://www.b-list.org/weblog/2022/may/13/boring-python-depe...), which recommends using the third-party `pip-tools` library alongside the standard library's `pip` and `venv`, and switched to that for the next project. It's been working great.

The project has involved a small team of scientists (four or five, depending) who use a mix of macOS and Windows. We do analysis and development locally, write production-facing algorithms in Python packages tracked in our repository, and publish releases to GitLab's PyPI. For our team, the "get up and running" instructions are "clone, create a venv, and pip install -r requirements.txt" (sketched below), and for the software team that manages the production systems, deploying an update just means pip installing a new version of the package.

Every team's got different constraints, of course, but this has been working very smoothly for us for over a year now, and it's been easy, no pushback, with everyone understanding what's going on. Really impressed with the progress of the core Python packaging infrastructure over the past several years.
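For anyone curious, the full pip-tools loop behind that looks roughly like this (requirements.in is the conventional input file listing only direct dependencies):

    python -m venv .venv && source .venv/bin/activate
    pip install pip-tools
    pip-compile requirements.in        # writes a fully pinned requirements.txt
    pip install -r requirements.txt    # or: pip-sync requirements.txt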


> Today, though, pip install works fine for those packages.

pip install works, but pip's dependency management doesn't seem to (for PyTorch, specifically), which is why projects that have pip + requirements.txt as one of their installation methods will often have separate PyTorch installation instructions for that method, though if the same project supports conda installation it will be a one-stop shop that way.
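Those separate instructions typically point pip at PyTorch's own wheel index, something along these lines (the exact index URL varies by CUDA/CPU build and is only illustrative here):

    pip install torch --index-url https://download.pytorch.org/whl/cpu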


> pip's dependency management doesn't seem to (for Pytorch, specifically)

That’s interesting — I’ve also had difficulties with PyTorch and dependency resolution, but only on the most recent versions of Python, for some period of time after they’re released. Picking Python 3.9 as a baseline for a project, for example, has been very reliable for PyTorch and all the related tooling.


One reason to choose one over the other is the dependencies they’re bundled with. Take numpy. With PyPI, it’s bundled with OpenBLAS, and with conda, it’s bundled with Intel MKL, which can be faster. See https://numpy.org/install/#


That’s a great point; I didn’t know about that!


Yeah, I'm curious how much uv is actually faster.

npm -> Yarn was life changing performance-wise.

I wonder what pip -> uv is.


Bun, ruff/uv, polars.. all have been major qol improvements.

I’m loving the pace of release with this crop of speed obsessed projects and I cannot wait for astral to tackle typing.


Ok, here are some benchmarks installing an application that has a couple build-heavy dependencies:

    Installation times (MM:SS):

    # No cache
    - uv:  01:05
    - pip: 01:56

    # Cache:
    - uv:  00:02
    - pip: 00:42


From my testing in Rye, it's significantly faster in day-to-day use. There are numbers in the blog post, obviously, but it's the feeling you get using it locally that makes it much more fun to use.


I did some basic testing today and uv was around 2x faster than pip for a clean venv and cold cache on a real, decent sized project. With warm uv cache it was incredibly fast, under 10 sec.


npm seems to have gotten a lot faster lately. All that competition from yarn and now bun seems to have pushed them to focus on optimization.


Oh certainly.

Npm has improved greatly.

Npm features follow a two-step process: (1) Get implemented in Yarn (2) Get implemented in npm.


So uv uses pubgrub-rs, a Rust implementation of the PubGrub version-solving algorithm originally written for the Dart language. I suppose I shouldn't be surprised, but I always loved Pub, the Dart/Flutter packaging tool. It's the only thing out there that comes close to Cargo that I know of. Fun to see these chains of inspiration reach across languages.


A lot of uncalled-for pessimism here in the comments. If Python folks reading haven't used Ruff yet, I can highly recommend it. The Astral folks have proven themselves to me already. Looking forward to more and more of their Rust-built Python tooling.


Gonna put my pessimist hat on and say, great - another python package manager. If it was at least a drop-in replacement for pip, we could switch to it as we did to ruff just for the speed gains.

We need a permanent solve for python package management, governed by the python steering council.


Literally the first words in the article say that it's a drop-in replacement for pip.


And the following examples show it is not. While it is pretty straightforward to add `uv ` in front of any pip calls, it is not API compatible.


Granted they are still working on it, but in the meantime - add a shell alias for uv?

alias pip="/usr/bin/uv pip"


How does that solve the problem that uv isn't a drop-in replacement? Are they going to implement the whole of pip, warts and all? Unlikely, because even though it's in Rust, they're getting a fair bit of speedup by making assumptions (see their ruff benchmarks; most of pylint isn't implemented), and as we've seen with both poetry and pipenv, those assumptions break down. pixi may get somewhat closer (given their experience with conda, and so familiarity with workflows and needs outside webdev), but I suspect uv will only further add issues and confusion to Python installs (especially if people decide to try aliasing and things break).


I jumped to conclusions in my previous comment, but that was based on a personal experience. Realistically, I've only ever needed 2 commands: `pip freeze > requirements.txt` and `pip install -r requirements.txt`. They don't need 100% API compatibility to deliver value, just "good enough". 80/20 rule and all that.
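Assuming uv's pip-compatible interface covers both of those subcommands (I haven't verified `uv pip freeze` specifically), the swap would just be:

    uv pip freeze > requirements.txt
    uv pip install -r requirements.txt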


The issue is, Python has multiple projects that are already doing that, and they've not been adopted (and then everyone complains that Python packaging is a mess...). Maybe uv will be like the easy_install -> pip transition, but I doubt it (the requirements are much higher and more diverse).

Arguably they'd get significantly further choosing a specific and narrow use case and nailing it (e.g. deploying WSGI/ASGI apps via OCI containers, with the ability to have live reload during dev), rather than introducing another option whose main selling point is speed, which, while nice, is hardly going to resolve why we have the different use cases and hence solutions.


I can't use it unless it implements list -o and install -U


take it all the way and force integrate it into venv


And you just took a VC-backed company at its word on a marketing article? I'm sure that's what they're aiming for, but at this time, it's not


The PSF has been asleep at the wheel for the past 10-15 years. The python packaging ecosystem has only gotten more fragmented under their stewardship.


Fully agree - they need to get it together


A lot is happening these last few years, it's literally exploding. I think it will start to consolidate.


Isn't it a replacement for pip? I've seen people on Twitter saying that they migrated very quickly. TFA:

> TL;DR: uv is an extremely fast Python package installer and resolver, written in Rust, and designed as a drop-in replacement for pip and pip-tools workflows.


I tested it on my repo, multiple errors, it's not a drop-in replacement.

  (.venv) lijok@mbp icebreaker % python -m uv pip install --upgrade uv setuptools wheel

  error: unexpected argument '--upgrade' found

  tip: to pass '--upgrade' as a value, use '-- --upgrade'

  Usage: uv pip install [OPTIONS] <PACKAGE|--requirement <REQUIREMENT>|--editable <EDITABLE>>

  For more information, try '--help'.

  (.venv) lijok@mbp icebreaker % python -m uv pip install -- --upgrade uv setuptools wheel

  error: Failed to parse `--upgrade`
  Caused by: Expected package name starting with an alphanumeric character, found '-'
  --upgrade
  ^

I'm sure it can be made to work - but it's definitely not drop in


What happens if you don't prefix with `python -m`? There's no mention of doing that in the blog post and it certainly feels wrong to call a rust package with python -m


That's specifying which Python to use. The error shown is that there's no support for `--upgrade`, so it's not a drop-in replacement for pip.


And there's no support for pip list -o.


This is very exciting! Congratulations to the Astral team on their work here.

I have historically expressed trepidation towards external attempts to "fix" Python packaging, largely because each attempt has historically accreted another layer of incompatibilities and hacks onto the giant pile of hacks that are already needed to keep the ecosystem running. As such, it makes me happy to see that compatibility is a priority here: the Astral team has gone out of their way to emphasize both formal (PEP) and informal (pip CLI) compatibility with existing tooling and standards.

(Separately, I share concerns about VC funding and sustainability. But everything I've seen so far indicates that Charlie and the other folks at Astral are not in it to screw or otherwise leverage the Python community. So I choose to be optimistic here.)


Prioritizing compatibility with a giant incompatible pile of historical hacks is how you grow the pile


I don't think that's true, at least in the case of Python packaging. The pile has historically grown because of informal or non-existent standards, along with multiple independent implementations attempting to align their observed (but not specified) behaviors.


Well, they do explicitly list certain historical features that they don't intend to support, like installation from eggs or editable installs. So they're doing some work to trim the pile while they're there.


They do support editable installs, but only from local directories. So you have to perform git clone yourself and then install.


We do actually support direct installation of Git and direct URL dependencies. Like, you can run `uv pip install` with:

    black @ git+https://github.com/psf/black

We also support editable installs for local directories, like you mentioned:

    black @ ../black

The thing we don't support is using _editable_ installs for Git and direct URL dependencies, like:

    -e git+https://github.com/psf/black

In my experience, these are really rare, and we didn't see a reason to support them.


Thanks for the correction, both of you. :)


Isn't this basically what pixi wants to be? Wouldn't it be better to work together?

https://github.com/prefix-dev/pixi/


Pixi is even more ambitious but with a different philosophy (I think? Specifically thinking about how pixi went with their own lock and manifest format, as well as first-class conda support). I'd definitely prefer if they worked together, or instead dedicated their time to type checking Python, which IMO there still isn't a great solution for.


Especially the conda support is IMO a cool thing.

As conda/pip interop is just not great. And even micromamba (C++ implementation of conda) is relatively slow to resolve compared to pip.

But agreed either work together or create a type checker. I use mypy currently but it definitely slows down my editor.


Yeah, pixi has decent mixed support for conda/PyPI: it currently solves and installs conda packages (then locks them) and then solves and installs PyPI packages. I think it's on their roadmap to solve them together, which would be a killer feature.


I have been working on a faster type checker for 3.5 years now. It's coming, I promise :)


Have you tried Pyright? It made me actually enjoy Python development.


For type checking it is very good, however, the error messages are sometimes not very human-readable. For example:

    def _deep_merge(updates: dict[str, str]) -> None:
        for key, update_info in updates.items():
            if isinstance(update_info, dict) and "values" in update_info:
                value = update_info["values"]
 
Errors out with Pyright:

    - error: Argument of type "Literal['values']" cannot be assigned to parameter "__key" of type "SupportsIndex | slice" in function "__getitem__"
        Type "Literal['values']" cannot be assigned to type "SupportsIndex | slice"
      "Literal['values']" is incompatible with protocol "SupportsIndex"
        "__index__" is not present
      "Literal['values']" is incompatible with "slice" (reportGeneralTypeIssues)
1 error, 0 warnings, 0 informations

It took me a great amount of staring to figure out that changing the signature of updates to dict[str, dict] was what it was complaining about.
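For reference, the change it was asking for amounts to this (a sketch; the value type could be narrowed further):

    def _deep_merge(updates: dict[str, dict]) -> None:
        ...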


I don't know if I'd use the word "enjoy", but it certainly makes python tolerable. With all due respect, it blows mypy out of the water.


Yes, but pixi is setting itself up for failure because it attempts to mix Anaconda and PyPI packages, which are fundamentally incompatible.

Hence they will always trigger user errors, and their image will be stained by it.


IIRC, some pip packages require compilation, which depends on entire toolchains with e.g. gcc and g++, and on dependencies like GTK, Qt, etc. How do they intend to make that less error-prone and thus more user-friendly?


Wheels have been a thing for a decade+. If they're not available from the developer, the complexity doesn't reduce much.


That's the really pointy end of packaging. Pillow (Python + C + external DLLs) produces something like 50 wheels per release, and we've still got platforms where you've got to compile your own. Since they tend to be lower volume, they also tend to have more interesting compiler issues.


Yes, I'm using Python on an nVidia Jetson system with PyTorch, which depends on CUDA, and all my other dependencies are quite challenging to get working in concert. Every time pip3 starts compiling under the hood, I'm just praying that it will work.


That's exactly what conda/mamba/micromamba is covering: the whole toolchain.


I've used conda/mamba, but often ended up with broken installs (random coredumps) that were mysteriously fixed by just using pip3 instead.


That's what they attempt to cover, but they're not super successful at it.


Initially I was confused by the business model here, but I eventually figured it out --- improved python tooling is going to accelerate AI so much that they can just rely on the future gratitude of the machine god.


There are a couple of promising tools written in Rust looking to replace Pip for most users.

Rip (https://github.com/prefix-dev/rip/issues), which is more of a library for other Rust tools to build on top of, like Pixi (which is looking to replace both Pip and Conda).

And now uv, which seems to be looking to replace Pip, Pip-Tools, and eventually Poetry and PDM.

A lot of the explosion in tools in the Python world is coming from the desire for better workflows. But it has been enabled by the fact that build configuration and invocation have been standardized, and tool makers are able to follow standards instead of reverse engineering easy_install or setuptools.

I know a lot of people are put off by there being so many tools, but I think in a few years the dust will settle and there will emerge a best practice work flow that most users can follow.

As a primarily-Python developer, and someone who occasionally contributes to Pip to solve complex dependency resolution issues, it does make me wonder if I should hang my hat on that and learn enough Rust to contribute to one of these projects eventually.


My experience with Rust developers who dwell in Python land is that they fundamentally disagree with most of the language that is Python and think they know better than incumbents what belongs and what doesn't.


The Rust ecosystem gets so much right that honestly, even as a career-long Python developer myself (and Rust for many years, but that's less of my point), they honestly probably do know how to build a good devxp better than much of the Python ecosystem.

Put other ways: the Python ecosystem has had 30+ years to figure out how to make packaging not suck. It has continually failed - failed less and less over time, sure, but the story is still generally speaking a nightmare ("throw it all in an OCI container" is an extremely reasonable solution to Python packaging, still, in 2024). I welcome advances, especially those inspired by tooling from languages that focused heavily on developer experience.


Tbf, being able to start from scratch makes it much easier to get those things right.

Being compatible with the mess that exists is where the difficulty comes from.


To be fair, python package management is in such a poor state that I would expect any outside opinion to not be worse.


I have written a couple million lines of python professionally. Some of that is still in prod.

The last 5 or so years I've mostly done work in rust... so I guess I'm a rust developer now.

Do the last 5 years invalidate my opinions, thoughts, or skills w.r.t. python somehow?


I'm surprised anyone who has gone from python to rust still finds contributing back to the python ecosystem worthwhile.


There's a lot of important work that happens in python. Most of it isn't being done by software engineers. I think the idea of improving things for that group is plenty meaningful.


I've been dying to see what they came up with next given how in love I am with Ruff. In my wildest dreams I didn't expect it to be a package resolver. I'm very excited about this.

edit: As a side note, if anyone from Astral happens by, any chance of an RSS feed on your blog?


I was kind of hoping it was a better type checker, something like mypy + pyright. To be fair that's an incredible amount of work, so maybe we'll see that later. Still very excited.


That was actually my first guess, and while I certainly would love to see that, packaging is such a headache for so many people that this could be amazing. Even with poetry I still occasionally get mysterious errors and wildly wrong resolver claims about versions not existing.


Why include mypy at all? Pyright is amazing and all you need.


Exciting stuff! I view Hatch [1] as becoming the Cargo for Python because it's already close and has an existing (and growing) user base but I can definitely see depending on this for resolution and potentially not even using pip after it becomes more stable.

[1]: https://hatch.pypa.io/latest/


> Like pip-compile, uv generates a platform-specific requirements.txt file (unlike, e.g., poetry and pdm, which generate platform-agnostic poetry.lock and pdm.lock files). As such, uv's requirements.txt files may not be portable across platforms and Python versions.

really curious on the reasoning behind this decision :)


Hard to give a concrete example, but you can end up in dependency deadlocks, with combinations of packages that require new features vs. packages that don't work on newer versions.

Mostly it's useful right after a new release, when prebuilt wheels aren't available for all packages yet and you have users who may or may not care about this.

If you only have a single requirements file, you get forced to choose which platform to support. With multiple, you can break out if needed. These breaking changes and deadlocks are rare. It's still good to have an escape hatch.


https://github.com/astral-sh/uv?tab=readme-ov-file#multi-ver...

says:

> uv does not yet produce a machine-agnostic lockfile.

so maybe the non-portable requirements.txt is just a first milestone


I'd be curious to see how "packaging for production" is addressed. As described, this is a tool for development purposes:

> Think: a single binary that bootstraps your Python installation and gives you everything you need to be productive with Python, bundling not only pip, pip-tools, and virtualenv, but also pipx, tox, poetry, pyenv, ruff, and more.

A lot of that dev tooling is not needed in prod, in fact it is a liability (features available for misuse, code/image size, higher resource requirements). Would there be a "uv-prod" to only deal with the minimum subset of environment setup / container building that enables production use? Would uv build packages that are runnable in prod but themselves don't contain uv? It'd be interesting to hear the plans.


My approach for this is to use multi-stage dockerfiles. Build tools live and operate in a build stage, and you copy the /venv into your runtime stage. I think this same approach should work with uv so long as it doesn't bundle itself into the venv.
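A minimal sketch of that pattern (base image, paths, and entry point are illustrative):

    # ---- build stage: has pip and any build tooling it needs ----
    FROM python:3.12-slim AS build
    RUN python -m venv /venv
    COPY requirements.txt .
    RUN /venv/bin/pip install -r requirements.txt

    # ---- runtime stage: only the interpreter and the populated venv ----
    FROM python:3.12-slim
    COPY --from=build /venv /venv
    COPY . /app
    CMD ["/venv/bin/python", "/app/main.py"]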


So what's the plan for monetization? NPM getting acquired by GitHub was an anomaly, and I'm wary of letting VC-backed companies becoming ecosystem lynchpins.


I hope they answer this ASAP. People in the Python community are already concerned that Astral is following the Embrace, extend, and extinguish playbook.


If all they did right now was leave Ruff exactly as it was and walk away they would have given the Python community a great gift.

Sure, it’s rendered a bunch of projects obsolete, but it’s because it’s so much better than the predecessors.


If they stop development on Ruff today, Ruff won't be useful anymore after a few new Python releases. If the maintainers of the tools that Ruff is replacing stop maintaining them, the whole Python community will be in a really bad place. So, there is reason to be cautious. Astral being transparent on how they plan to make money would be very helpful.


> If they stop development on Ruff today, Ruff won't be useful anymore after a few new Python releases

That is true for every single development tool in the Python space.


I do get the caution, but I think that even in that case there’s enough momentum behind Ruff for the community to fork and carry on.

Obviously it would be pretty crummy to end up in a situation like pyright where MS really have leant in to their embrace, extend, extinguish strategy.


People could just fork it if that happens. Since the foundation of the tooling is solid, I'm sure the community would rally around a fork that keeps it going.


“People in the Python community”?

Yeah. Probably precisely the same people that call everything EEE.

You’ll always find people acting alarmist about anything.


It's MIT and Apache dual licensed. If the company fails, you fork and move on. If the company never makes a dime, that's OK for me: some folks got paid to make open source tooling.


What they are doing is already discouraging people from contributing to the projects their tools are replacing[0]. If they go out of business and stop supporting their tools, it might leave the Python community in a bad place.

[0] https://www.youtube.com/watch?v=XzW4-KEB664


That video is.. weird. It claims that astral isn't contributing back despite the entirety of their code being permissively licensed. It's also sort of baffling that making tooling more accessible doesn't seem to be considered contributing back.

I'm not sure what the maker of that video wants, does he want money to be poured back into the community, or for no one else to make money?


I'm not sure this is a reasonable framing: LLVM stole much of GCC's thunder by being significantly easier to contribute to (and extend externally), but I don't think it would be accurate to say that LLVM is "discouraging" people from contributing to GCC.


I'm not sure if you can compare a project that came out of a university and got adopted by Apple with a project developed by a VC backed company with no revenue. I'm sure Charlie has the best intentions with both ruff and uv, but we have no idea how this is going to play out.


My understanding of Ruff's history is that it predates Astral, and was originally a side project by Charlie. I have similar reservations about VC funding and sustainability, but I don't think Ruff was created as a trojan horse or with a secret takeover plan in mind.

I agree that we have no idea how it'll play out. But I've seen nothing but good faith from Charlie and others, and I see no reason to preemptively condemn their attempt to build a business.


LLVM was at least written in the same language it is compiling.

In this case they are replacing Python code with Rust which might exclude large part of Python community from being able to contribute.


By this token, we should be concerned that CPython is not written in Python. But that historically has not posed a significant risk to CPython's longevity, nor to the ability of motivated contributors to learn the C necessary to contribute to it.

(Or another framing: only the tiniest fraction of the Python community was contributing to CQA tooling in the first place. It's not clear that prioritizing the interests of the 99.9% of Python programmers who -- rightfully! -- will never modify these tools makes sense.)


The slow death of GCC has been unquestionably bad for copyleft software.


Why is it LLVM's job to be good for copyleft software?


It's not. But some suspicion of successor projects is probably warranted by the devs and users of existing tools.


Wouldn't their failure re-encourage people to contribute?


What is wrong with pip that people thought we needed conda, and all these other managers?


If you want to learn every lesson a package manager author has learned in the last twenty years by way of counterexample, use pip.

Global dependencies by default (fixed by venvs). Shipping precompiled binaries that are not portable across platforms or Python versions (kinda fixed by wheels, I forget the PEP number). No lock files with checksum verification by default (kinda fixed by requirements.txt with --require-hashes, but not really). Also, there's a bunch of weird shit that can happen if the pip version and Python version disagree, due to setuptools.

It's got a great interface for installing stuff. But the likelihood that what it installs breaks something on your machine is pretty high the longer you use it.

Edit: the real, fundamental problem with pip is that it can't be saved. Too many Dockerfiles and server provisioning scripts are relying on it - its interface, semantics, CLI output, etc. It's not 100% broken, which is why it's still useful. But any true fix has to be done by creating a separate tool.


A long list of terrible defaults with workarounds that get you halfway to a decent package manager:

* Defaults to installing packages globally instead of per project.

* requirements.txt feels very much like a second class citizen when you contrast it with package.json, Cargo.toml, build.gradle, pom.xml, or most other dependency systems in other package managers.

* No native support for lockfiles short of pinning every version of every project in requirements.txt. This solution is inferior to what is available in ~every other package manager, because a pip freeze doesn't distinguish between explicit dependencies and transitive dependencies.

I'm sure there are others, but they're all along the same lines—pip is strictly inferior to basically every other package manager out there for other languages. Things that people expect to be part of a package manager have ugly, hacky workarounds instead that aren't uniformly applied across projects.
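To illustrate the lockfile point: a `pip freeze` snapshot flattens everything into one list (the versions below are made up), so nothing records that only requests was asked for explicitly while the rest are its transitive dependencies:

    certifi==2024.2.2
    charset-normalizer==3.3.2
    idna==3.6
    requests==2.31.0
    urllib3==2.2.1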


Imperative dependency (mis)management. Pip resolves what needs to be installed while it installs things, depending on the setup script of each package. It requires a full execution, and it is neither repeatable nor reproducible. So it is impossible to know what needs to be pulled from PyPI ahead of time.

Pip is also strongly tied to the specific Python version. If you upgrade the Python interpreter on a system, congrats, all virtual environments and all PyPI packages are broken now. Hopefully you saved the minimum set of requirements and they don't have some catastrophic dependency breakage. You run pip install and pray.

What? Native binaries? Building them on the fly while installing a pip package? Oh you're brave.


> upgraded Python ... all virtual environments and all PyPi packages are broken now.

Misleading. Continue to use the older version, or packages will need to be installed in a new folder for the new version. Recreate venvs and/or simply reinstall any --user packages.


Have you ever seen a pip failure that required you to go use a different package manager and then come back and try pip again? That's what.


I haven't. The only packaging problem I saw deploying python applications was with one particular package that was only available through conda.

I was able to build useful things with pip + virtualenv


Ah well I'm happy that you've managed to avoid it but there are a million stack overflow posts just like this one:

https://stackoverflow.com/a/34631976

Pip fails with some kind of "not found" error and then you have to use some other package manager to supply the missing thing.


Do you plan to support platform-agnostic lockfiles? From the wording it sounds like "yes" but also "this is a feature not a bug." Which is it?

Checksum verification and portable lockfiles are kind of table stakes for me, which is why I use poetry everywhere possible. I can't give up correctness for speed.

Sidenote: this would be super useful as a library so I could plug it into build systems. Managing Python dependencies in standards-compliant ways is a pain, so shelling out to tools like poetry and pip is common, even though it would be nicer to write that code, e.g. in Rust, and just use the parts you care about.


Yes, we absolutely want to support platform-agnostic lockfiles. Very important thing.

The "feature not a bug" tone is probably because we intentionally limited the scope of this release to _not_ support platform-agnostic resolution, since it adds a lot of complexity (and we were still able to ship something that's immediately useful for those that rely on pip and pip-tools today).


While we don't particularly care about pip lockfiles, etc., since we're moving to the conda/mamba ecosystem, a faster pip is most welcome. We invoke it a lot programmatically and it's very, very slow.


Something that is super super important to me is editable installs, and local references to those editable installs. Often I have several editable packages set up in a virtualenv which I think of as like my "base" virtualenv.

One reason I struggled with Rye earlier is that it kind of felt like this wasn't entirely supported, possibly because I couldn't parse the information on the workspaces or virtual projects page. Maybe someone else figured this out? It does seem common enough that I would be surprised if it wasn't supported.


This does editable installs, I tried it myself yesterday


I think having a good story for local workspaces is gonna be a key piece of this next phase of the project, where we grow uv from a "pip replacement" to a "Poetry replacement" -- or, phrased differently, to something that looks more like Cargo. (Cargo does a bunch of nice things here that I want to learn from.)


I think Rye actually does handle this mostly correctly (as the sibling comment said). I got through some of it here: https://github.com/mitsuhiko/rye/issues/671. I think it's very close to what I actually want (maybe not what Armin wants with multiversion).


Why is this written in Rust and not Python?

Does this show deficiencies of Python as a general purpose language?


I don't know if the world needs extra examples to prove that python is very slow but this is certainly one more


That has been discussed before, same question for ruff, rye etc.

Maybe it shows the challenges of using Python for deployment and distribution of binaries.

uv and Rye want to install and manage Python itself, so they have to solve the bootstrapping problem: should a tool that installs Python require Python to already be installed?


This company is Rust-focused and they compete on fast implementations. Python has always leaned towards prototyping and then reimplementation in a faster language when performance becomes an issue.

They also seem to have money for rapid development, which trumps under-resourced community projects.


Almost two years ago now, I complained about the resemblance to the Ruby community's packaging conflict of a decade ago. I may have been in the minority, but I was apparently not alone:

https://news.ycombinator.com/item?id=32116649

In my experience, Python packaging has gotten asymptotically worse in the last 90 days, and I've been using the language for 20 years now on a multi-platform triumvirate of Linux, macOS, and Windows that has not changed in twenty years.

I can honestly say that, given the daily failures in the packaging ecosystem (pip failing to build eggs, pkg-config problems, and the like), I'm happy to see a solution from Rust. I sent an intern into the polars underworld two years ago to great success, and my reliance on Rust infrastructure has only grown more broadly since.

This may very well be the only packaging innovation in Python that I actually look forward to trying tomorrow. I'd already be working in Rust full-time if I could only get past the syntax, which reminds me too much of Perl after bathing in the semantic-whitespace light for this long.


Coupling with Rye is very interesting, as that approach (or the PyBi idea) seems destined to take over. Managing different interpreters is just too much effort, and each tool currently handles it differently.

Of course, the PSF will remain silent on the issue and let yet another tool exist in the ecosystem. I really do not care who wins, but we need a standard to be decided.


Why do people here claim that the PSF has any leverage to stop tools being developed?


I don’t want the PSF to declare tools cannot be made. I want the PSF to pick a winner so we can finally consolidate on a workflow.

Just today, I was helping a beginner set up Python on their laptop. A nightmare collection of tooling complexity (so you have Python, but now you need to worry about virtual environments; to manage those, I use this combination of pipx + virtualenvwrapper; but to do actual projects, you need Poetry; but…), all while noting that this is an opinionated workflow that I happen to use because of the tradeoffs that matter to me. Ask someone else, and they will definitely have a different configuration. The entire first lesson was just getting Python installed, with tons of elided complexity.

I should be able to point to a document on Python.org that says, “This is the way.” Yet for decades, they have refused to take a stand.


Isn't it your job as an engineer to pick the right toolchain for the job?


I don't think I had heard of Rye before, but this sounds like a strange arrangement to me: basically the same project and idea, so the newer one switched the older one to use it 'under the hood', 'took it over', and will replace it 'when the time is right'? Why... any of that? Why not just buy it and let it be uv instead of starting uv, or just leave it alone?


I wrote down a small little FAQ at the bottom, hope it answers some questions: https://lucumr.pocoo.org/2024/2/15/rye-grows-with-uv/


That is helpful, and fair enough, thanks!


Very cool! Of note, I made something along these lines a few years ago, although with a slightly broader scope that also included managing and installing Python versions. I abandoned it due to lack of free time and edge cases breaking things. The major challenge is that Python packages that aren't wheels can do surprising things, because setup.py runs arbitrary code. (https://github.com/David-OConnor/pyflow)

For some more context, at the time, poetry and pipenv were both buggy, and had some limitations their teams listed as "wont-fix". The sort of thing where you hit a breaking bug immediately on certain systems. These have since been fixed.


This article says they don't support "platform-agnostic lockfiles" YET, and platform-agnostic lockfiles have been mentioned several times in comments here too. I haven't encountered these before (with Python, anyway): how do they handle platform-specific dependencies, e.g., binary wheels?

For instance, there are >30 dists for numpy 1.26.4 [1], with different artifacts for different platforms and Python interpreters. Which of those hashes ends up in a platform-agnostic lockfile? Do you have to stick with sdists only?

[1] https://pypi.org/project/numpy/#files


All of them - poetry adds them as an array, and the right one is installed for the platform.


PDM and Poetry would include hashes for all possible wheels, and would also follow transitive paths for platforms other than the current host. E.g. if my host is Linux and I depend on foo, and foo depends on bar only on windows, and bar depends on baz, my lock file will still contain baz.
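
For illustration, here's a rough sketch of how an installer can pick the right artifact from such a lockfile, using the third-party `packaging` library; the entries list and hashes below are made up for the example rather than parsed from a real lock file:

    # Sketch: choose the wheel whose tags match the current interpreter.
    # A real lockfile would carry one (filename, sha256) pair per artifact.
    from packaging.tags import sys_tags
    from packaging.utils import parse_wheel_filename

    entries = [
        ("numpy-1.26.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", "sha256:..."),
        ("numpy-1.26.4-cp312-cp312-win_amd64.whl", "sha256:..."),
        ("numpy-1.26.4-cp312-cp312-macosx_11_0_arm64.whl", "sha256:..."),
    ]

    supported = list(sys_tags())  # tags this interpreter/platform can run
    for filename, digest in entries:
        _, _, _, tags = parse_wheel_filename(filename)
        if any(tag in supported for tag in tags):
            print("would install", filename, "and verify", digest)
            break

The point is that a platform-agnostic lock file records every published artifact and lets the installer select whichever one matches the environment at install time.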


Very nice! Not long ago we got rid of a bunch of tools at work in favor of ruff, so I'm looking forward to this, especially when the time comes to replace Poetry with it; hoping it's a good implementation.


Most of the time Poetry spends resolving goes to downloading all the different versions of a package. Unfortunately, the dependency info is stored inside the wheel itself. I can't imagine how uv plans to make the downloads faster.
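
(One way resolvers can avoid pulling entire wheels is to fetch dependency metadata on its own; PyPI's JSON API exposes requires_dist per release, and newer index features serve wheel METADATA separately. A rough sketch of the JSON route, with no claim that this is what uv actually does:)

    # Sketch: read a package's declared dependencies from PyPI's JSON API
    # instead of downloading and unpacking the wheel. The package and
    # version here are just examples.
    import json
    from urllib.request import urlopen

    name, version = "requests", "2.31.0"
    url = f"https://pypi.org/pypi/{name}/{version}/json"
    with urlopen(url) as resp:
        info = json.load(resp)["info"]

    print(info["requires_dist"])  # e.g. ['charset-normalizer<4,>=2', ...]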


For envs which are pets, who cares, but this will be great for those cattle situations.


Sign me up! A single binary to manage Python is something I have been hoping for.


Something I've been waiting to see from language package managers is more attention paid to security. I believe both cargo and pip packages can run arbitrary code as the local user the instant they are installed, and malicious packages have existed in the wild for years. I also recall a blog post where someone was scanning PyPI for malicious Python packages, only to realize that `pip download` also executed code.

Just downloading a library to a local source code directory should not cause arbitrary code to run on your system, and there should be safeguards in place so developers are not one typo away from a malicious package pwning their entire system. Supply chain attacks remain a huge area of concern.

Instead, when I "ctrl+f security" the homepages of any of these new packaging systems, I get 0 results. Not good.
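
(The closest partial mitigation I'm aware of today is refusing sdists entirely, so no package-supplied setup.py ever gets executed during a download; a rough sketch below, with the package name and destination directory as placeholders:)

    # Sketch: download dependencies while refusing source distributions, so
    # no package-supplied setup.py runs as part of the download.
    import subprocess
    import sys

    subprocess.run(
        [sys.executable, "-m", "pip", "download",
         "--only-binary=:all:",   # wheels only; fail rather than build an sdist
         "--dest", "wheels/",
         "requests"],
        check=True,
    )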


> I also recall a blog post where someone was scanning PyPI for malicious Python packages, only to realize that `pip download` also executed code.

I think you're thinking of this post[1]. The code being searched for wasn't malicious, just buggy :-)

[1]: https://moyix.blogspot.com/2022/09/someones-been-messing-wit...


Thanks, that's the one! I was actually just trying to find it.

Heh, I forgot about the anime catgirl part.


While I don't see the immediate need to switch from using pip, I am such a fan of ruff that I'm always happy to check out what you guys come out with.

Please keep doing what you're doing!


This is great. Just adopted the pip-tools approach on a project and that has been great. Excited to give this a try.


An ability to override dependencies, finally!


Obligatory two xkcd's, but I'm super excited about this project! Definitely going to try it out.

https://xkcd.com/1987/ https://xkcd.com/927/



