This is very new behavior in pip. Not so long ago, imagine this:
You `pip install foo`, which depends on `bar==1.0`. It installs both packages. Now you run `pip install baz`, which depends on `bar==2.0`. It installs baz and silently upgrades bar to 2.0. Better hope foo's compatible with the newer version!
I think pip only changed fairly recently to resolve conflicts, or die noisily explaining why it couldn't be done (the new resolver became the default in pip 20.3, in late 2020).
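Roughly, with the made-up packages from above (a sketch, not real pip output):

```
$ pip install foo   # foo requires bar==1.0 -> installs foo and bar 1.0
$ pip install baz   # baz requires bar==2.0

# old pip (pre-20.3): installs baz and silently upgrades bar to 2.0,
# quietly breaking foo
# new pip: backtracks to find a compatible set, or fails with a
# ResolutionImpossible error explaining the conflict
```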
Simple for simple cases - but then you update a dependency, which updates a dependency that has a narrow range of allowed versions because one release had a security issue, and suddenly you're downgrading three other packages.
It can get complicated. The resolver in uv is part of its magic.
JavaScript has truly rotted the brains of software developers.
You pull the security patches for whatever your dependencies are into your local, vetted PyPI repository. You control what you consider liabilities, and you don't get shocked by breakages in what should be minor versions.
Of course, you have to be able to develop software, and not just snap Lego bricks together, to manage a setup like that. Which is why uv is so popular.
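The mechanics of pointing pip at such a mirror are mundane; a sketch, with a made-up internal URL:

```
# ~/.config/pip/pip.conf -- use the vetted internal index instead of pypi.org
[global]
index-url = https://pypi.internal.example/simple

# or per invocation:
#   pip install --index-url https://pypi.internal.example/simple foo
```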
You can make it a language flame war, but the Python ecosystem has had no problem making this bed for itself. That's why people are complaining about running other people's projects, not about setting up their own.
Sensible defaults would completely sidestep this; that's why uv is so popular. Or you can be an ass to people online to feel superior, which I'm sure really helps.
You're implying that I have to run a local PyPI just to update some dependencies for a project? When other languages somehow manage without that? No way I'm doing that.
Some organizations force you to use their internal dependency repos because the "IT department" or similar has blessed only certain versions in the name of "security" (or at least security theater).
Inevitably, these versions are out of date. Sometimes, they are very, very out of date. "Sorry, I can only install [version from 5 years ago]" is always great for productivity.
I ran into this recently with a third party. You'd think a 5-year-old version would trigger alarm bells...
I'm wondering if people like you are getting paid to vet other people's libraries? Because on every modern project I have ever seen, you couldn't do much else with the rest of your day given the amount of library updates you'd have to vet.
Open a requirements.txt and a package-lock.json next to each other and compare. Then you will know the answer to the question of what npm, cargo, and the others do better than pip. Oh, did I sneak a "lock" in there? Damn right I did.
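To make the contrast concrete (both excerpts are made up, and the hash is truncated):

```
# requirements.txt: top-level intent, loose ranges, transitive deps implicit
flask>=2.0
requests

# package-lock.json (excerpt): every package, transitive ones included,
# pinned to an exact version with an integrity hash
"node_modules/lodash": {
  "version": "4.17.21",
  "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
  "integrity": "sha512-..."
}
```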
I remember advocating for running nightly tests on every project/service I worked on, because inevitably, one night, one of the transitive dependencies would update and shit would break. At least the nightly test forced it to break early, rather than when you needed to do something else, like an emergency bug fix, and ran into it then.
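One low-tech version of that, as a cron-driven script (the path is hypothetical, and it assumes pytest is in requirements.txt):

```
#!/bin/sh
# nightly-dep-check.sh: rebuild the venv from scratch so freshly released
# (transitive) versions get resolved, then run the tests
set -e
cd /srv/myproject
rm -rf .venv
python3 -m venv .venv
.venv/bin/pip install -r requirements.txt
.venv/bin/python -m pytest
```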
npm did not always do it right, and IMO still does not do it completely right (nor does pnpm, my preferred replacement for npm -- but at least it has `--frozen-lockfile`, which forces it to do the right thing), because transitive dependencies can still be updated.
cargo can also update transitive dependencies (you need `--locked` to prevent that).
Ruby's Bundler does not, which is the only correct default behaviour. Neither does Elixir's mix.
I don't know whether uv handles transitive dependencies correctly, but lockfiles should be absolute and strict for reproducible builds. Regardless, uv is an absolute breath of fresh air for this frequent Python tourist.
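For reference, the strict spellings mentioned in this subthread, plus uv's (flags move around between versions, so double-check yours):

```
$ npm ci                           # install exactly what package-lock.json says
$ pnpm install --frozen-lockfile   # fail if the install would change the lockfile
$ cargo build --locked             # fail if Cargo.lock would change
$ bundle install                   # honours Gemfile.lock by default
$ uv sync --locked                 # fail if uv.lock is out of date
```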
npm will not upgrade transitive dependencies if you have a lockfile. All `--frozen-lockfile` or `npm ci` do is fail instead of updating the lockfile when incompatible versions are specified inside `package.json`, which should never happen unless you have manually edited the `package.json` dependencies by hand.
(It also removes all untracked dependencies in node_modules, which you should also never have unless you've done something weird.)