HACKER Q&A
📣 breckenedge

Why is Python package management still a dumpster fire?


Today I was trying to use Python to build some custom ZMK firmware, which relies on a package named west. For the life of me, I cannot figure out how to get it installed so that it's in my PATH. Why is python package management still this bad after so many years, with so many wonderful examples of how good a package manager can be?


  👤 stackbutterflow Accepted Answer ✓
This thread is textbook trolling to get an answer:

```

- I discovered that you'd never get an answer to a problem from Linux Gurus by asking. You have to troll in order for someone to help you with a Linux problem.

- For example, I didn't know how to find files by contents and the man pages were way too confusing. What did I do? I knew from experience that if I just asked, I'd be told to read the man pages even though it was too hard for me.

- Instead, I did what works. Trolling. By stating that Linux sucked because it was so hard to find a file compared to Windows, I got every self-described Linux Guru around the world coming to my aid. They gave me examples after examples of different ways to do it. All this in order to prove to everyone that Linux was better.

- ion has quit IRC (Ping timeout)

- brings a tear to my eye... :') so true..

- So if you're starting out Linux, I advise you to use the same method as I did to get help. Start the sentence with "Linux is gay because it can't do XXX like Windows can". You will have PhDs running to tell you how to solve your problems.

```


👤 brundolf
You have to get these things right early on, and get everyone on the same page. Once things start to get fragmented, once you have historical decisions that prevent you from improving certain aspects, there's only so much you can do after the fact to paper over the gaps.

Python's packaging story is rotten in its bones. I think at this point it's nearly impossible to fix (though many continue to try).

The way I see it, a solution would require:

- A cohesive, self-contained, easy-to-use, all-inclusive toolset, with a completely standardized manifest format for packages. And, for all that is holy, one that doesn't mess with the global environment, so we don't have to lie to it with virtual environments.

- That toolset would have to get overwhelming adoption by the community, to where it retroactively becomes The Python Package Manager (outside of legacy codebases, which would continue to languish). This would probably require endorsement by the Python developers, or for the tool to be so unassailably better that it's preferred without exception, or possibly both. Otherwise we'll continue to have: https://xkcd.com/927/

I want to emphasize that on the second point 50% adoption is not enough. 70% adoption is not enough. I'm talking 90%+ adoption for libraries and new projects; everything outside of it needs to be a rounding error, or we're still in fragmentation hell.

And then, even in the best case, there would be a long tail for years to come of packages that haven't been ported to the new tool/manifest format, where you have to duck out to one of the other package managers to pull them in.


👤 mr_mitm
What do you mean? You type `pip install west` and a binary will appear in `~/.local/bin/west`. This directory needs to be part of your $PATH variable. Done.
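Where that per-user bin directory lives varies by platform, and Python can tell you directly; a small sketch (no install needed to check):

```shell
# Sketch: ask Python where "pip install --user" puts console scripts,
# instead of hard-coding ~/.local/bin (the location varies by OS).
user_bin="$(python3 -m site --user-base)/bin"
echo "user scripts land in: $user_bin"
export PATH="$PATH:$user_bin"
```

On Linux this typically prints a path under `~/.local`, matching the directory above.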

Using virtual envs or pipx or poetry or whatever is non-standard.

Packaging Python projects, however, ... don't get me started on the whole `setup.py`, `setup.cfg`, `pyproject.toml` debacle. This article has more information about it, but the fact that this is supposed to be the short version makes it even more infuriating: https://blog.ganssle.io/articles/2021/10/setup-py-deprecated...
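For what it's worth, the direction that debacle is converging on is declarative metadata in `pyproject.toml`. A minimal sketch, written from the shell (the project name and dependency are hypothetical):

```shell
# Sketch: a minimal declarative pyproject.toml, the intended successor
# to setup.py metadata. "example-app" and its dependency are invented.
cat > pyproject.toml <<'EOF'
[project]
name = "example-app"
version = "0.1.0"
dependencies = ["requests>=2.28"]
EOF
cat pyproject.toml
```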


👤 julienpalard
I don't see the dumpster fire here:

$ python -m pip install west

That's it, no venv, no sudo.

pip will install it in ~/.local/, so you'll need a:

    PATH="$PATH:$HOME/.local/bin"

to get it onto your PATH.

👤 fxtentacle
I find python packages quite easy to work with :) plus they're similar to ruby gems.

You create a venv or bundle, list requirements in a text file, then ask it to install things for you.

And if you need custom stuff, you can just pip install a .whl file, too.
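That workflow, sketched end to end (the requirements file and wheel filename are hypothetical, and the install lines are commented out since they need network access):

```shell
# Sketch of the venv + requirements workflow described above.
python3 -m venv .venv                     # create an isolated environment
. .venv/bin/activate                      # activate it (POSIX shells)
python -m pip --version                   # this pip now targets .venv only
# python -m pip install -r requirements.txt            # the listed deps
# python -m pip install ./custom-0.1-py3-none-any.whl  # or a local wheel
deactivate
```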

I have yet to encounter a case where it's not working as expected, so my answer would be that python isn't getting fixed because it's not broken.

wontfix, works for me


👤 shantnutiwari
Why? Just read the comments here. A combination of hubris and misunderstanding of how hard it is.

"Just do pip install" --> Whenever I hear this comment, I know Im talking to someone who has never used scientific libraries like numpy, scipy etc. Never seen the problem of dependency versions going into a mess because Pip by default doesnt pin dependencies (Poetry does, but it is not standard).

Python packaging is a mess because, for some weird reason that baffles me, a large majority won't even admit there is a problem. And they will start jumping on you and calling you an idiot if you say there is. A lot of gaslighting going on here.


👤 simonw
I think Python package management is pretty good now. The absolute toughest bit is learning how to work with virtual environments, but once you have a strong understanding of that I think the system works very well.

👤 franga2000
I really don't see the problem here - you have two options, both of which involve one install command and at most a one-step setup process:

1. A user install: `pip install [package]` and make sure "$HOME/.local/bin" is on your PATH

2. A global install: `sudo pip install [package]` - it will be installed to a dir on your path already (/usr/bin I think)

As for why pip is not ideal for installing software: it's not supposed to be. It's a Python package manager, not a software package manager. It's meant to install libraries for the Python interpreter to import, not commands for you to run. Of course, people do often use library managers to install software (npm, cargo, go...), but the experience is the same in all of them: either you install with sudo and it "just works" but might cause problems later, or you install in "local" mode, which requires you to add the package manager's directory to your PATH.


👤 vegai_
You need three things usually:

- virtual environment (python3 -m venv --help)

- pip for manually installing things

- poetry for declaratively installing things

From a UX point of view, this is already pretty good and tends to work well. What sometimes makes it awkward and inefficient is that many Python projects either don't declare their dependencies at all, or declare them with very specific versions, which apparently forces the resolver to do things quite heuristically.


👤 cik
I think the reality is that it's misunderstood and misused. Perhaps this is an education problem.

1. All the virtualenv-related things are not package managers. They're isolation tools. They ensure your application controls its environment.

2. Pyenv, asdf, and others provide the same benefits for versions of Python. They're incredibly powerful tools for those of us who need to support multiple versions.

3. Tools like conda, poetry, Pipfile, etc., serve to solve dependency management based on how you, as a user, want to interact.

Given my needs, I find poetry invaluable (and I use none of its venv integrations). For specific projects I rely on pip, as that meets the goal there.

It could be better, but IMHO poetry and friends have drastically improved the situation over pip freeze.


👤 rcarmo
I don’t get it, honestly. I have been using Python since 1.2 and never had this kind of problem since we moved to pip.

That may be because I use virtualenvs and pyenv extensively, but even without that as long as you understand how pip works and where it places packages and binaries for a non-root user, it is mostly a matter of setting PATH in your shell (I do it with a conditional to see if .local/bin exists) and you’re done…
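The conditional PATH line mentioned above might look like this in a shell profile:

```shell
# Sketch: only add pip's per-user script directory to PATH if it exists.
if [ -d "$HOME/.local/bin" ]; then
    PATH="$PATH:$HOME/.local/bin"
fi
export PATH
```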

I also have historically had very few issues with dependencies (other than lack of some OS libraries to rebuild a wheel when, say, I’m using Pillow on ARM and it needs to find libjpeg), and yet there is a constant stream of complaints here on HN… I wonder why.

Would it be the OS? (I use Mac, Fedora and Ubuntu, and just sync my projects across all machines with local virtualenvs - everything runs everywhere, but I don’t use Windows for Python development)

Is it specific packages? Complaints seem to be generic.

Or is it just Eternal September all over again?


👤 PaulHoule
Because a system that almost works is an enemy of a system that works.

👤 Saphyel
I had been trying to push this PEP ( https://peps.python.org/pep-0582/ ), which requires very minimal work and would help a lot of people, but it seems stuck.

So I'm not really surprised pip is pretty much abandoned/dead. No one wants to introduce changes or improve it; it's easy to think: it works (for me), so why change anything?


👤 makeitrain
I joined a hackathon this weekend on a Python project. I’m brand new to Python, but I know how to code and set up my Linux environment (so I thought).

First, I install the python3 packages I think I need from apt and set up VS Code plugins for Jupyter notebooks. Great, I have a REPL environment and can use the Python library we're working on.

Now I want to hack on the library code. The project docs say to run "just make". OK, `just` is a Rust tool, so I'll install cargo via apt and use it to get `just` installed.

The Justfile's install recipe just has a pip install with the dependencies listed, which appear to be defined in pyproject.toml. I run `just install`, and at the end it throws errors about dependencies.

In the end, I used conda to set up the environment and get everything installed. But it seems like if the environment doesn’t have something, it gets it from the system instead of installing its own.

The project used setuptools. The python3-pip apt package requires python3-setuptools v59. PEP 660 adds editable installs, implemented in setuptools v64.

My takeaway is that you should not use your system-installed Python packages for development, and should install as few of them as your system needs, or else you'll have to set up more overrides in your project.
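In other words, a per-project venv whose tooling is decoupled from apt's python3-pip and python3-setuptools; a sketch (the venv name is arbitrary, and the PEP 660 editable install is commented out because it needs the project's build backend):

```shell
# Sketch: a venv carries its own pip/setuptools, independent of the
# versions that the apt packages pin.
python3 -m venv hackathon-env
hackathon-env/bin/python -m pip --version   # the venv's pip, not the system's
# hackathon-env/bin/python -m pip install -e .   # PEP 660 editable install
```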


👤 gmt2027
The python packaging situation could be better but poetry is pretty good.

I think the real problem with python packaging is a cross-language one. Show me a language that can elegantly handle building C/C++ dependencies across platforms. This is where things really break down. Python is unusual in this because it relies on C/C++ libraries for performance in a lot of domains.


👤 js2
For Python packages that include something you want in your PATH you can use pipx:

   $ brew install pipx # (1)
   $ pipx ensurepath
   $ pipx install west
1. https://pypi.org/project/pipx/

Alternately you can create a virtual env just for west:

    $ python3 -m venv west
    $ west/bin/pip install west
    $ west/bin/west

If you installed west with `python -m pip install west`, then `west` should be in the same place in your PATH as `python`. You can probably also run it with `python -m west`.

👤 ghusto
Is this more of a "someone explain what `PATH` is" question? Because that's what it sounds like. It doesn't sound like this has anything to do with package management.

👤 kwertzzz
One solution I often hear about for Python packages is the use of wheels. But I am wondering how this can work when multiple packages depend (directly or indirectly) on a C library like zlib. As far as I know, there is no authority that decides which zlib version should be used. So if package A uses zlib.so.1 but package B uses zlib.so.2, can both be loaded at the same time? (The version numbers are just examples of two incompatible versions.)

👤 PartiallyTyped
My actual issue is the absence of namespaces. It's a relatively small thing, but it makes the odds of installing the wrong thing much lower, simply because you would have to get both the provider and the package name wrong.

Furthermore, it allows us to fork stuff without having to also change the name. Sure, installing via pip and git solves the issue, but why not inside PyPI?


👤 jastr
The best way to get software engineers to solve your issue with X is to phrase your question “Why is X so horrible? It can’t even do Y.”

The engineers will inevitably reply “That’s so simple. You just need to …”

The ecosystem for managing python dependencies has improved a lot: pyenv, virtualenv, poetry.

PATH isn’t innate to Python. Understanding PATH will definitely help with other issues in the future.


👤 dvfjsdhgfv
The fact that you don't know how to install a package so that it's in your path doesn't mean Python package management is a "dumpster fire." In most cases it works far better than I had expected and I can only admire the amount of effort that went into this.

👤 grp000
For any python project bigger than trivial, I literally just build a docker image and run it from that.

👤 BiteCode_dev
The dirty secret about Python packaging problems is that a lot of them... are not related to packaging.

Python has first and foremost a big __bootstrapping__ problem, and the consequences of that are often seen when you try to install or use dependencies. So people conclude that packaging sucks.

Now don't get me wrong, packaging has some problems, but compared to most languages out there, it's actually pretty good.

But people cannot enjoy that, because they would have to install, set up and run Python in a perfect sequence.

So what's wrong with Python bootstrapping, and why does it affect packages?

Well, as is often the case in FOSS, it stems from too much diversity. Diversity is touted as this one great thing, but its benefits also come with a cost.

- there are too many ways to install Python: official installer, anaconda, Windows Store, pyenv, homebrew, Linux repositories... They all come with different configurations and lead to problems, such as anaconda being incompatible with pip, or the default Linux setup not providing pip.

- it's very common to have numerous and diverse Pythons installed on your machine: different versions, different distributions, sometimes an embedded Python... Now it's your job to know which Python you start and how. But people have no idea you should use "py -3.10 -m pip install --user black" and not "pip install black" on Windows if you want a clean install at the system level. Yes, this weird command is the only clean way to do it outside of a venv. I'm not making it up.

- the variation between the different ways to run Python and its commands leads to confusion: "python", "python3.10", "py -3.10", "pip", "pip3", "python -m pip", "sudo pip"... There are many wrong ways to do it, and not many good ones, and it all depends on the distribution, the OS, and whether you are in a venv or not.

So how does that affect packaging? Well, people will get a lot of "command not found" errors, and so they will google and paste commands until one works. There is too much to know, and they want a solution, not to learn the whole graph of possibilities. Eventually they do something that seems to work, but they probably installed with admin rights, or in anaconda with pip, or into the wrong Python version. On import or on run, something will then break. Maybe now. Or worse, maybe in one month. And they have no way to understand why.

What's the short term solution for you?

- Always use the official installer on Windows and Mac. Don't use anaconda, don't use homebrew, don't use pyenv.

- If you have to use Anaconda (I would advise not to), don't use virtualenv and pip, use their own tool: conda and the anaconda prompt. You will be limited in packages, but it will work. Don't mix conda and pip.

- On Linux, you will have to figure out what magic packages you need to install to get a full Python setup, because Linux distribution authors don't provide a full Python by default. It's different for each distribution, and sometimes for each Python version. It must include pip, venv, setuptools, tkinter, ssl and headers. Good luck.

- Don't use Python outside of a virtualenv. Ever. I know how to do that correctly. You probably don't. So don't. Always create a venv for each project you work on, and activate it before doing anything. From there, you can use pip and Python in peace. E.g. don't install black or jupyter outside of a venv. You want to use poetry? Don't let it create your venv. Create one, activate it, install poetry with pip inside it, then use poetry from there. I'm not kidding.

- On Windows, the command is "py -3.10 -m venv name_of_your_venv" to create it, with "-3.10" replaced by the version of Python you want. On Mac and Linux, the command is "python3.10 -m venv name_of_your_venv". Yes. Not a joke.

- To activate it, on Windows the command is "name_of_your_venv\Scripts\activate"; on Mac and Linux, "source name_of_your_venv/bin/activate".

Any variation on this (including using -m or pipx, which I used to recommend but don't anymore) assumes you know what you are doing.
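Putting the Mac/Linux variant of that advice in one place (the venv name is arbitrary; on Windows swap in "py -3.10" and the Scripts\activate path as described above):

```shell
# Consolidated sketch of the per-project venv routine described above.
python3 -m venv name_of_your_venv            # one venv per project
. name_of_your_venv/bin/activate             # activate before doing anything
python -m pip --version                      # pip now scoped to this venv
# python -m pip install poetry               # tools go inside the venv too
deactivate
```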

What's the long term solution?

I give some hints here:

https://twitter.com/bitecode_dev/status/1567790837848743937

And meanwhile, take the official survey on packaging, this will help pin things down for the team:

https://www.surveymonkey.co.uk/r/M5XKQCT


👤 james-redwood
https://xkcd.com/1987/

A relevant XKCD comic that encapsulates this well.


👤 Am4TIfIsER0ppos
Because you don't want to use your system's package manager apt, pacman, or whatever.

👤 benced
Node developer here - it’s crazy that the canonical solution embraced by Python developers involves full-on virtual environments.