You need a runner for scripts that follow the PEP (more precisely, the packaging standard originally established by the PEP, hence the note about its historical status).
The two main runners I am aware of are uv and pipx. (Any compliant runner can be referenced in the shebang to make a script standalone where shebangs are supported.)
As long as your `/usr/bin/env` supports `-S`, yes.
It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names you would use in a `uv pip install` command, not the names you `import` in the code, though the two may coincide.
Linux coreutils has supported this since 2018 (coreutils 8.30); amusingly, it's the same release that added `cp --reflink`. AFAIK you have to opt out by setting `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` in your environment, or passing `--pedantic`. [1]
If I were to put on my security hat, things like this give me shivers. It's one thing if you control the script and specified the dependencies. For any other use-case, you're trusting the script author to not install python dependencies that could be hiding all manner of defects or malicious intent.
This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.
If you’re executing a script from an untrusted source, you should be examining it anyway. If it fails to execute because you haven’t installed the correct dependencies, that’s an inconvenience, not a lucky security benefit. You can write a reverse shell in Python with no dependencies and just a few lines of code.
It's a stretch to go from "executing a script with a build user" or "from a validated, immutable distro package" to "allowing something to download evergreen code and install files everywhere on the system".
I've used Tiger/Saint/Satan/COPS in the distant past. But I think they're somewhat obsoleted by modern packaging and security like apparmor and selinux, not to mention docker and similar isolators.
Most people like their distro to vet these things. uv et al. had a reason to exist when Python 2 and 3 were a mess; I think that time is well behind us. pip is mostly for installing libraries, and even that is mostly already handled by the distros.
Sorry, I was half asleep! I meant that you can easily look at the code in the script and audit what it does – you can just run `cat` on it and you're done!
But it’s much harder to inspect what the imports are going to do and be sure they’re free of any unsavory behavior.
If that's your concern, you should be auditing the script and the dependencies anyway, whether they're in a lock file or in the script. It's just as easy to put malicious stuff in a requirements.txt.
There's a completely irrational knee-jerk reaction to curl|sh. Do you trust the source or not? People who gripe about this will think nothing of downloading a tarball and running "make install", or downloading an executable and installing it in /usr/local/bin.
I will happily copy-paste this from any source I trust, for the same reason I'll happily install their software any other way.
It really depends on the use case. A one-off install on a laptop that I don't use for anything that gets close to production - fine by me.
For anything that I want to depend on, I prefer stronger auditability to ease of install. I get it, theoretically you can do the exact same thing with curl|sh as with a git download, inspecting dependencies, installing from source and so on. But in reality, I'm lazy (and per another thread, a 70s hippie) and would like to nip any temptation to cut corners in the bud.
I hate that curl $SOMETHING | sh has become normalized. One does not _have_ to blindly pipe something to a shell. It's quite possible to pull the script in a manner that allows examination. That Homebrew also endorses this behaviour doesn't make it any less of a risky abdication of administrative agency.
But then I'm a weirdo that takes personal offense at tools hijacking my rc / PATH, and keep things like homebrew at arm's length, explicitly calling shellenv when I need to use it.
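For what it's worth, the pull-then-examine pattern is only one extra step. The URL below is a placeholder, and a locally created file stands in for the download so the sketch actually runs:

```shell
# Instead of: curl -fsSL https://example.com/install.sh | sh
# fetch to a file first, read it, then run it.
# Real-world fetch would be: curl -fsSL https://example.com/install.sh -o install.sh
printf 'echo "hello from installer"\n' > install.sh

cat install.sh        # inspect before executing
sh install.sh         # run only after review
```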
I also recommend the flag for a max release date, set to $current_date: that basically locks all package versions as of that date without a verbose lock file!
(sadly, uv cannot detect the release date of some packages. I'm looking at you, yaml!)
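If anyone's looking for it, I believe the setting being described is uv's `exclude-newer`, which (as I understand it) can also live in the script's inline metadata; the date here is illustrative:

```python
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# [tool.uv]
# exclude-newer = "2024-06-01T00:00:00Z"
# ///
```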
I'm planning to distribute PAPER as a zipapp where the bootstrap script does some initial unpacking to put wheels in a temporary directory, then delegates to the contained code (similar to how the standard library `ensurepip` works). The code that creates the zipapp is in the repository and I'm thinking of genericizing that and making it separately available, too.
Although several variations on this theme already exist, I'm sure. https://github.com/pex-tool/pex/ is arguably one of them, but it's quite a bit bulkier than what I'm looking for.
This isn't really boilerplate; it declares dependencies needed in the environment, which cannot in general be inferred from static analysis of the code.
> Based on the man page it doesn’t appear to be doing anything useful here
The man page tells me:
    -S, --split-string=S
           process and split S into separate arguments; used to pass
           multiple arguments on shebang lines
Without that, the system may try to treat the entirety of "uv run --script" as the program name and fail to find it. Depending on your env implementation and/or your operating system, this may not be needed.
Without -S, `uv run --script` would be treated as a binary name (including spaces) and you will get an error like "env: ‘uv run --script’: No such file or directory".
-S causes the string to be split on spaces and so the arguments are passed correctly.
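You can see both behaviors directly from the command line (requires GNU coreutils >= 8.30 or a BSD env with `-S`):

```shell
# With -S, the single string is split into words, so 'echo' is the
# program and 'one two' are its arguments -- same as a shebang like
# '#!/usr/bin/env -S uv run --script'.
env -S 'echo one two'             # prints: one two

# Without -S, env looks for a program literally named "echo one two"
# (spaces included) and fails.
env 'echo one two' 2>&1 || true   # env: 'echo one two': No such file or directory
```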
If I download a Python project from someone on the same network as me, and they wrote it against a different Python version than mine plus a requirements.txt, I need all those things anyway.
I mean, if you use == constraints instead of >= you can avoid getting different versions, and if you've run it before (or other scripts whose combined requirements are a superset), you might already have everything locally in your uv cache, too.
But, yes, Python scripts with in-script dependencies plus uv to run them doesn't change dependency distribution; it just streamlines use compared to manually setting up a venv per script.
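For reference, pinning with == happens right in the script header; the package and version below are illustrative:

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "requests==2.32.3",
# ]
# ///
```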