
> Make sure you don't have anything else relying on Python because they will probably never work again.

This is why when I see some clever open source tool discussed on HN and I go to the repo and see it's written in Python I close the browser window and pretend I never saw it.

Yes, I know there are ways to protect yourself when using Python, in much the same way that lead-lined glove boxes protect you when working with plutonium, but I can never remember the proper CLI incantation to make the lead-lined glove box appear.



These kinds of histrionics are really uncalled for. Virtual environments are easy to work with. https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... is a solid tutorial.


Virtual environments are an incomplete solution at best. In particular, they really don't help much with the use case of wanting to install a tool: if you're installing a tool and not just setting up a development environment for working on a specific project with its dependencies, then you probably want to make that tool usable without activating its venv. The virtual environment capabilities shipped with Python itself don't really have any affordances for that.


> then you probably want to make that tool usable without activating its venv. The virtual environment capabilities shipped with Python itself don't really have any affordances for that.

The only reason it isn't "usable" is because the wrapper script isn't on your system's path. Unless your tools actually depend on venv-related environment variables being set (for example, because they make subprocess calls and assume that `'python'` will use the currently running Python, when they should correctly be using `sys.executable`), which in my experience is very rare, you don't ever actually have to activate a venv to do anything with it. In particular, you don't need to activate it to use its REPL; you can instead directly specify the path to its Python executable.
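For example, with no activation involved (paths illustrative):

  $ path/to/venv/bin/python              # the venv's own REPL
  $ path/to/venv/bin/python -m pip list  # its pip, invoked the same way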

The missing affordance basically looks like `ln -s path/to/venv/bin/tool ~/.local/bin`. Which is a big part of what Pipx does for you. (And once I realized that, I started using Pipx's vendored shared copy of pip: https://zahlman.github.io/posts/2025/01/07/python-packaging-...)
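With Pipx the whole flow collapses to something like this (assuming Pipx is installed and its default ~/.local/bin is on your PATH; package name illustrative):

  $ pipx install yt-dlp
  $ yt-dlp --version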


uvx / uv tool works great for that.

You can `uv tool install your_package`, add a directory to your PATH, and then launch the tool directly, with it installed in its own venv.
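As a sketch (assuming uv's tool bin directory, typically ~/.local/bin, is on your PATH; package name illustrative):

  $ uv tool install yt-dlp
  $ yt-dlp --version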


  $ virtualenv ~/venv/yt-dlp
  $ . ~/venv/yt-dlp/bin/activate
  $ pip install yt-dlp
  $ ln -s ~/venv/yt-dlp/bin/yt-dlp ~/bin/yt-dlp
  $ deactivate
  $ yt-dlp ...


Isn't that really obviously about five steps too many to be considered a reasonable way of installing a package?

(And you didn't handle getting an appropriate version of python installed.)


> Isn't that really obviously about five steps too many to be considered a reasonable way of installing a package?

That's why I use pipx. But the activation and deactivation in that example are completely unnecessary, and the last line is just using the installed package. Installation actually looks like:

  $ python -m venv ~/venv/yt-dlp
  $ ~/venv/yt-dlp/bin/pip install yt-dlp
  $ ln -s ~/venv/yt-dlp/bin/yt-dlp ~/bin/yt-dlp
which is only two steps too many, if we acknowledge that there has to be at least one step. This all, of course, can also just be put in a shell script, or a function in your ~/.bashrc.
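For example, a minimal sketch of such a function (assuming ~/venv and ~/bin exist, ~/bin is on your PATH, and the package installs an entry point with the same name as the package):

  venv-install() {
    python -m venv ~/venv/"$1"
    ~/venv/"$1"/bin/pip install "$1"
    ln -s ~/venv/"$1"/bin/"$1" ~/bin/"$1"
  }

after which `venv-install yt-dlp` reproduces the three steps above.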

Pip just happens to be laser-focused on the actual installation of packages, so it doesn't provide that wrapper. (With PAPER I'm aiming for a slightly broader scope, but still something intended specifically for users that's only one piece of a developer toolchain.)

> (And you didn't handle getting an appropriate version of python installed.)

When was the last time you tried to obtain an application that could be installed as a PyPI package and your system Python wasn't compatible? Everyone knows the CPython release cadence now and is strongly pressured to advance their own support in lock-step with it. Similarly for libraries. There's already a full set of wheels for the latest version of NumPy on 3.14 (22 of them). Even musl-based distributions are provided for. Even 32-bit Windows is provided for, for those few holdouts.

If your target audience doesn't have Python, and doesn't understand on a basic level what it is, why would they be able to understand the uv command line, and accept using it to install a program?


I don't get it. Then you just install the tool outside of a venv? Then it's installed for your user account.


But then all the tool's dependencies have to play nice with the dependencies of all your other unrelated Python-based tools.

One thing you can do (I'm not saying it's user friendly) is set up the tool in a virtualenv and then set up an alias like

    alias foo-tool='"/home/blah blah/tools/something/env/bin/python" -m foo_tool'


Or you can make a symlink from a directory on your PATH, which is how Pipx does it.


But this is true for anything that isn't statically linked?


Why would static linking be necessary? The virtual environment contains all the needed dependencies, and isolates them from everything else.


ah, right!


That requires you to be running the right version of Python at the system level, and for all your installed tools to have compatible package versions. It doesn't work very often for me.


What sorts of things are you installing that you "often" need to care about having the "right" version of Python? It's normal in the Python ecosystem that packages explicitly support every non-EOL Python version. Numpy already has wheels out for 3.14. And even if you use an older Python, the older compatible Numpy versions don't go away.

Can you give a concrete example of an installation you attempted that failed?


It's rarely the python version itself that is the problem (provided everything supports 3.9+ or similarly recent versions).

The package versions are, however, fraught: our machine learning codebase at work was stuck on the 1.x versions of numpy for more than a year, as scipy and ultralytics both took forever to upgrade, and that prevented us from adopting packages that depend on 2.x numpy.


The entire point of the virtual environments is that you can have both on your system, as needed, for separate projects.

But the language isn't designed to support multiple versions of the same library in the same runtime environment. If you get them both to load (and there are a lot of ways), they are different libraries (actual different module objects) with different names (or the same name in different namespaces). Packaging tools can't do anything meaningful about this. I write about this on HN more often than I'd like; see e.g. https://news.ycombinator.com/item?id=45467514 . It's also covered in https://zahlman.github.io/posts/2024/12/24/python-packaging-... .
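The "both on your system" part looks like this (a sketch; the version pins are illustrative), though note that each interpreter still sees exactly one numpy:

  $ python -m venv env-np1 && env-np1/bin/pip install 'numpy<2'
  $ python -m venv env-np2 && env-np2/bin/pip install 'numpy>=2'
  $ env-np1/bin/python -c 'import numpy; print(numpy.__version__)'   # 1.x
  $ env-np2/bin/python -c 'import numpy; print(numpy.__version__)'   # 2.x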


Bookmarking this.


Don't, use `uv`. Literally just install uv and then try something like `uvx bakeit` and it'll download and run the tool in its own virtualenv. You don't need to bother with anything.


Everybody else uses virtual environments and alternate installations of Python instead of installing packages into the system Python installation. It is not that hard.


That is the incantation.


I memorized it quickly enough after some time experimenting with CUDA/AI tools.

  python -m venv .
  . bin/activate
  pip install -r requirements.txt


`uvx <tool name>` and you're done. You don't even need to install the tool first.


The ephemeral environment that uv creates is as much an "installation" as the permanent one you get by doing it manually.


The fact that you called one "ephemeral" and the other "permanent" makes me think you contradicted your own point.


No, I just don't think the word "install" has the same connotations that you apparently think it does.


These days, if I'm feeling generous I'll spend a minute or two to see if I can get a promising Python tool to install with uv. If it's not going to easily submit to a `uv tool install`, then I move on and forget about it.


UV has gone a long way toward fixing that issue with Python.


uv has not really done that much. It's all been possible, and usually about as ergonomically. It's just opinionated in a way that people currently seem to like, and fast primarily due to good internal design (not because it's written in 🚀 Rust ✨, although that certainly is a net positive to performance).


UV hasn't done anything except for all the parts that matter. (And while there are compelling arguments that Rust has nothing to do with it, the correlation is pretty strong.)


UV has provided easy solutions for engineers that are easily frustrated by a lack of easy solutions.

There's nothing wrong with easy solutions, but any python engineer worth their salt has long since solved and moved on from the issues that UV claims to solve.

I believe UV provides value, even significant value, to light Python users, but for those working with Python day in and day out, maybe you're using a new tool, but life has not changed significantly for the better. Or you just sucked and didn't reach for any of the perfectly usable solutions to all of these problems that existed before UV showed up.


uv provides a breadth of functionality that no single tool has before, and that no simple, easy combination of tools has before. For developers, it replaces poetry with something much faster and more reliable, and replaces pipenv. For end users, it replaces pipx and pyenv, and pretty much replaces pip itself (which no longer wants to be used by end users). And most importantly, uv relieves you of having to remember which of the preceding tools are suitable for which use cases.
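As a rough sketch of that consolidation (these are real uv subcommands, though exact defaults may differ by version):

  $ uv python install 3.13   # what you'd previously use pyenv for
  $ uv tool install ruff     # what you'd previously use pipx for
  $ uv add requests          # what you'd use poetry/pip for, inside a project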


I do have a blog post planned on the topic that's hopefully only a few posts away, and a renewed commitment to start frequent blogging again. And this has all been off topic, so I'll spare the reply.


uv has done for Python what Docker did for containers. Could you accomplish what Docker did with OS primitives like cgroups? Of course. Do most devs know what those are, much less how to use them? Doubtful.

I consider myself to be fairly good with Python, and excellent with *nix. uv is far and away the best tool for managing Python projects, tools, and scripts, bar none. Sorry, Poetry - you had a good run.


Any project written in a language with a big user base (Java, C#, C++, Perl, Rust, even Fortran) has this problem. The only thing that helps is experience with the language. I very seldom see code that survives ten years; even zero-dependency things fail because your compiler or interpreter changes.

It is just part of the job. Sure, I am not a big fan of C# or PowerShell, but a big part of that is just that I have no experience with them.


That’s not generally true for .Net, though the use of third-party libraries could create an issue in some cases.

.Net was designed deliberately so that multiple versions could be installed side by side, and an executable would pick the version most likely to work based on its target version and compatibility. In most cases .Net is also forward compatible, so e.g. a .Net 3 app continues to work on a PC where only .Net 4.8 is installed. In addition, libraries can be part of the application installation, and in modern .Net, the framework itself can be part of the application installation.

In most cases, everything will just work, and when it doesn’t, one can just install the older .Net version needed and nothing will be broken.


What I mean by "no experience" is that I have never fixed issues with them myself. The number of hours we were billed internally for fixing a broken .Net app tells me it was not that easy, though.

My point is that if you do not know your platform, things get complicated, either for yourself or for the people who take over after you. For me personally, Python does not stand out as especially hard to handle.


Nope. Python has a habit of regularly breaking working stuff.

It's a developer/community problem - the language itself doesn't require it.


I have never been a Python fan, but I wrote my first Python application in Zope 26 years ago, and it still runs. Since then I have touched other things that are much worse, not all of them on the list above. My point is that while Python may break, other stuff breaks as well.

There are a lot of prototype integration projects in Python, though, and those are incredibly hard to get running. I know nothing about C#, but as Linus Torvalds said, the binary compatibility layer for Linux is Win32, because of Steam; so MS has worked on the problem before. Though I would like to point out that when you are talking about prototype integration projects, Windows is as bad as Python; I have rebuilt test environments in Windows that had been left to rot for ten years.

So in my work-life experience, all programming languages suck at this; Python might suck less or more, but it is more of a skill issue than anything.


Is Python still that bad? I remember the big problems were during the Python 2 -> Python 3 transition, but in the last few years I've managed to get away with a single Python install and haven't really had any compatibility issues.

I stick with Python.org packages for macOS, and the official Python packages on Ubuntu, and everything seems to work just fine.


There are better tools for managing the madness, certainly. uv makes python management almost pleasant (sandboxing the whole environment by default is a wise choice).


That sounds like a good way to greatly increase the disk space and RAM required for each tool, or each running copy of a tool or application. Almost as bad as turning everything into an Electron app.


You aren't wrong, but it's pretty much the only sane way to deal with an ecosystem that doesn't allow multiple versions of the same package to coexist; otherwise we're all stuck on the lowest common denominator of each configuration of packages.


It may sound that way to you, but it doesn't have that problem for me.


Ehm... surely there are ecosystems that make Python look brilliant and shiny in comparison.



