ericfrederich's comments

Yes, it's a mess (New: now with Rust!)


I am totally against Python tooling being written in a language other than Python. I get that C extensions exist and for the most part Python is synonymous with CPython.

I think 2 languages are enough, we don't need a 3rd one that nobody asked for.

I have nothing against Rust. If you want a new tool, go for it. If you want a re-write of an existing tool, go for it. I'm against it creeping into an existing eco-system for no reason.

A popular Python package called Pendulum went over 7 months without support for 3.13. I have to imagine this is because nobody in the Python community knew enough Rust to fix it. Had the native portion of Pendulum been written in C I would have fixed it myself.

https://github.com/python-pendulum/pendulum/issues/844

In my ideal world if someone wanted fast datetimes written in Rust (or any other language other than C) they'd write a proper library suitable for any language to consume over FFI.
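As a sketch of that FFI pattern: a library exposing a plain C ABI can be consumed from any language, and Python needs only the stdlib's ctypes. Here libm stands in for a hypothetical Rust cdylib (the calling convention would be the same for `#[no_mangle] extern "C"` functions):

```python
import ctypes
import ctypes.util

# Load a shared library by its C ABI. A Rust crate built as a cdylib
# would load the same way; libm is used here only as a stand-in.
libm = ctypes.CDLL(ctypes.util.find_library("m") or None)

# Declare the C signature: double sqrt(double)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

root = libm.sqrt(2.0)  # about 1.414
print(root)
```

The point being that a consumer never needs the library's implementation language installed, only the compiled artifact.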

So far this Rust stuff has left a bad taste in my mouth and I don't blame the Linux community for being resistant.


I appreciate this perspective, but I think building a tool like uv in Rust is a good idea because it's a tool for managing Python stuff, not a tool to be called from within Python code.

Having your python management tools also be written in python creates a chicken-and-egg situation. Now you have to have a working python install before you can start your python management tool, which you are presumably using because it's superior to managing python stuff any other way. Then you get a bunch of extra complex questions like, what python version and specific executable is this management tool using? Is the actual code you're running using the same or a different one? How about the dependency tree? What's managing the required python packages for the installation that the management tool is running in? How do you know that the code you're running is using its own completely independent package environment? What happens if it isn't, and there's a conflict between a package or version your app needs and what the management tool needs? How do you debug and fix it if any of this stuff isn't actually working quite how you expected?

Having the management tool be a compiled binary you can just download and use, regardless of what language it was written in, blows up all of those tricky questions. Now the tool actually does manage everything about python usage on your system and you don't have to worry about using some separate toolchain to manage the tool itself and whether that tool potentially has any conflicts with the tool you actually wanted to use.


Python is my favorite language, but I have fully embraced uv. It’s so easy, and so fast, that there is nothing else remotely close.

Need modern Python on an ancient server running with EOL’d distro that no one will touch for fear of breaking everything? uv.

Need a dependency or two for a small script, and don’t want to hassle with packaging to share it? uv.
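For that small-script case, uv reads PEP 723 inline metadata from the top of the file, so the script itself declares its dependencies. A sketch (file name and dependency list are just examples; the list is left empty here so the demo also runs as plain Python):

```python
# demo.py, run with: uv run demo.py
# uv reads the PEP 723 header below, creates an ephemeral environment
# with the declared dependencies, and runs the script in it.
#
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
import json

payload = {"tool": "uv", "pep": 723}
encoded = json.dumps(payload, sort_keys=True)
print(encoded)
```

In a real script you would list things like `"requests"` in `dependencies` and share the single file; no packaging required.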

That said, I do somewhat agree with your take on extensions. I have a side project I’ve been working on for some years, which started as pure Python. I used it as a way to teach myself Python’s slow spots, and how to work around them. Then I started writing the more intensive parts in C, and used ctypes to interface. Then I rewrote them using the Python API. I eventually wrote so much of it in C that I asked myself why I didn’t just write all of it in C, to which my answer was “because I’m not good enough at C to trust myself to not blow it up,” so now I’m slowly rewriting it in Rust, mostly to learn Rust. That was a long-winded way to say that I think if your external library functions start eclipsing the core Python code, that’s probably a sign you should write the entire thing in the other language.


> I am totally against Python tooling being written in a language other than Python

I will be out enjoying the sunshine while you are waiting for your Pylint execution to finish


Linting is the new "compiling!"


Linting and type checking are very CPU intensive tasks so I would excuse anyone implementing those types of tools in $LANG where using all CPU juice matters.

I can't help but think uv is fast not because it's written in Rust but because it's a fast reimplementation. Dependency solving in the average Python project is hardly computationally expensive, it's just downloading and unpacking packages with a "global" package cache. I don't see why uv couldn't have been implemented in Python and be 95% as fast.

Edit: Except implementing uv in Python requires shipping a Python interpreter, kinda defeating some of its purpose as a package manager that can also install Python.


Nope, this is totally an area where using Rust makes sense and is just _fast_. The fact that Rust has concurrency primitives that are easy to use helps tons too.


I still don't get it, uv is checking if dependencies exist on disk, if they do it creates a link from the cache to your environment, it's a stat syscall and a hardlink syscall in the best of worlds (after solving dependency versions but that should already be done in a lockfile).

Interpreter startup time is hardly significant once in one invocation to set up your environment.

What makes Rust faster for downloading and unpacking dependencies? Considering how slow pip is and how fast uv is (100s of X), it seems naive to attribute it to the language.
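The fast path being described, stat the cache then hardlink into the environment, does fit in a few lines. A toy sketch (not uv's actual code), with a copy fallback for when cache and environment sit on different filesystems:

```python
import os
import shutil
import tempfile

def materialize(cache_path: str, env_path: str) -> str:
    """Link a cached file into an environment: one stat plus one
    hardlink syscall on the happy path, a copy otherwise."""
    os.makedirs(os.path.dirname(env_path), exist_ok=True)
    os.stat(cache_path)  # raises if the package isn't cached yet
    try:
        os.link(cache_path, env_path)  # hardlink: no data copied
        return "linked"
    except OSError:  # e.g. cache and env on different filesystems
        shutil.copy2(cache_path, env_path)
        return "copied"

# Tiny demo with a throwaway "cache" file
workdir = tempfile.mkdtemp()
cache_file = os.path.join(workdir, "package.whl")
with open(cache_file, "w") as f:
    f.write("wheel bytes")
env_file = os.path.join(workdir, "env", "package.whl")
result = materialize(cache_file, env_file)
```

Per-file this really is cheap; the cost is doing it thousands of times, which is where syscall overhead and concurrency start to matter.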


You also have to factor in startup time and concurrency. Caching and SAT solvers can't get Python to 95% of uv.


>I am totally against Python tooling being written in a language other than Python. I get that C extensions exist and for the most part Python is synonymous with CPython.

>I think 2 languages are enough, we don't need a 3rd one that nobody asked for.

Enough for what? The uv users don't have to deal with that. Most ecosystems use a mix of languages for tooling. It's not a detail the user of the tool has to worry about.

>I'm against it creeping into an existing eco-system for no reason.

It's much faster, because it's not written in Python.

The tooling is for the user. The language of the tooling is for the developer of the tooling. These don't need to be the same people.

The important thing is whether the tool solves a real problem in the ecosystem (it does) and whether people like it.


I, on the other hand, don't care what language the tools are written in.

I do get the sentiment that a user of these tools, being a Python developer could in theory contribute to them.

But, if a tool does its job, I don't care if it's not "in Python". Moreover, I imagine there is a class of Python environment setup problems that would break the very tool that could help you fix them, if that tool is itself written in Python.


It is well known, and not Python-specific, that using a different language/interpreter for development tools eliminates large classes of bootstrapping complications and conflicts.

If there are two versions of X, it becomes possible to use the wrong one.

If a tool to manage X depends on X, some of the changes that we would like the tool to perform are more difficult, imperfect or practically impossible.


> I think 2 languages are enough, we don't need a 3rd one that nobody asked for.

Look at the number of stars ruff and uv got on GitHub. That's a meteoric rise. They were validated with ruff and continued with uv; this we can call "was asked for".

> I'm against it creeping into an existing eco-system for no reason.

It's not for no reason. A lot of other things have been tried. It's for big reasons: good performance, and secondly, independence from Python is a feature. When your Python managing tool does not depend on Python itself, it simplifies some things.


In theory, I can get behind what you're saying, but in practice I just haven't found any package manager written in Python to be as good as uv, and I'm not even talking about speed. uv as I like it could be written in Python, but it hasn't been.


I really dig rye, have you tried that?


rye is also written in Rust and it's being replaced by uv.

From its homepage: https://rye.astral.sh/

> If you're getting started with Rye, consider uv, the successor project from the same maintainers.

> While Rye is actively maintained, uv offers a more stable and feature-complete experience, and is the recommended choice for new projects.

> Having trouble migrating? Let us know what's missing.


It's also Rust.


Rust offers a feature-set that neither Python nor C has. If Rust is the right tool for the job, I would rather the code be written in Rust. Support has more to do with incentive structures than implementation language.


What, exactly, is your objection to using rust (or any non-python/C language) for python tooling? You didn't actually give any reasons


I believe he alluded to it here...

"I have to imagine this is because nobody in the Python community knew enough Rust to fix it. Had the native portion of Pendulum been written in C I would have fixed it myself."


Correct. There better be a damn good reason to add another language to the ecosystem other than it's that particular developer's new favorite language.

Is there anything being done in uv that couldn't be done in Python?


How many people are digging into and contributing to any python tooling? How is C meaningfully more accessible than rust? Plenty of people (yet also a significant minority overall) write each of them.

> Is there anything being done in uv that couldn't be done in Python?

Speed, at the very least.

You could just ignore uv and use whatever you want...


> How is C meaningfully more accessible than rust

In an ecosystem where the primary implementation of the language is in C and nearly all native extensions are written in C, do you really not know the answer to that?


> How is C meaningfully more accessible than rust

They've been teaching C in universities for like 40 years to every Computer Science and Engineering student. The number of professionally trained developers who know C compared to Rust is not even close. (And a lot of us are writing Python because it's easy and productive, not because we don't know other languages.)


If C + Python is so wonderful and so ubiquitous, why hasn't someone already created uv in C?

P.S. the government and others have all recommended moving from C/C++ to Rust... It's irrelevant whether or not that's well-founded - it simply is.

And plenty of other cli tools have been successfully and popularly ported to Rust.


I think you'll find that C (and C++) are rapidly disappearing from computer science curriculums. Maybe you'll encounter one or both in Operating Systems, or an elective, but you'll be hard pressed to find recent graduates actually looking for work in those languages.


To quote Movie Mark Zuckerberg from The Social Network:

> If Python developers were the inventors of uv - they'd have invented uv


Well, to be fair, I think they did: it's a successor to Rye, which was built in Rust by the guy who made Flask, and inspired by how cargo works.


Hmm, maybe. Though, IIRC rye and uv were more parallel developments rather than uv's lineage tracing back to rye. Also at the point mitsuhiko created rye, he had handed off maintenance of Flask for ~8 years already and was arguably more associated with efforts in the Rust community than in Python.

However, in both cases (uv and rye) it took someone with a Rust background to build something to actually shake up the status quo. With the core PyPa people mostly building on incremental improvements in pip, and Poetry essentially ignoring most PEP effort, things weren't really going to go anywhere.


speed


I don't see any meaningful speedup. The 10x claims are not reproducible. He's also comparing it to the much older style of requirements.txt projects and not a poetry project with a lockfile.

I detailed this in another comment but pip (via requirements.txt): 8.1s, poetry: 3.7s, uv: 2.1s.

Not even 10x against pip and certainly not against poetry.


You must be holding it wrong, because everyone else raves about uv


Usually uv pip is only about x2 as fast as regular pip for me. Occasionally I'll have some combination of dependencies that will cause pip to take 2-5 minutes to resolve that uv will handle in 10-20 seconds.


They said "no meaningful speedup". 2x is meaningful


The impact of a 2x speedup is relative. For a quick test on one of my projects it's 10 seconds with pip and 4 seconds with uv. That's roughly in line with my previous testing. It's a nice minor speedup on average. It really shines when pip does some non-optimal resolving in the background that takes a minute or more.


How complex are the requirements for this project?



I see. I encourage you to try it with larger projects and see if it makes a difference.

That said, the speed is only one reason I use it. I find its ergonomics are the best of the Python tools I’ve tried. For example, it has better dependency resolution than Poetry in my estimation, and you can use `uv run --with` to try things before adding them to your environment.


you say "I'm against it creeping into an existing eco-system for no reason.", while you ignore that there is at least one good reason: A lot better performance.


The 10x performance wasn't mentioned in the article at all except the title.

I watched the video and he does mention it going from 30s to 3s when switching from a requirements.txt approach to a uv based approach. No comparison was done against poetry.

I am unable to reproduce these results.

I just copied his dependencies from the pyproject.toml file into a new poetry project. I ran `poetry install` from within Docker (to avoid using my local cache) with `docker run --rm -it -v "$(pwd)":/work python:3.13 /bin/bash`, and it took 3.7s.

I did the same with an empty repo and a requirements.txt file and it took 8.1s.

I also did through `uv` and it took 2.1s.

Better performance? Sure. A lot better performance? I can't say that with the numbers I got. 10x performance? Absolutely not.

Also, this isn't a major part of anybody's workflow. Docker builds happen typically on release. Maybe when running tests during CI/CD after the majority of work has been done locally.
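For anyone who wants to reproduce numbers like these, a minimal end-to-end timer is enough (the commented-out commands are examples; as the benchmark above does with a fresh container, control for cold vs warm caches):

```python
import subprocess
import sys
import time

def time_cmd(*argv: str) -> float:
    """Run a command and return wall-clock seconds, end to end."""
    start = time.perf_counter()
    subprocess.run(argv, check=True, capture_output=True)
    return time.perf_counter() - start

# e.g. time_cmd("uv", "pip", "install", "-r", "requirements.txt")
#      time_cmd("pip", "install", "-r", "requirements.txt")
elapsed = time_cmd(sys.executable, "-c", "pass")
print(f"interpreter startup: {elapsed:.3f}s")
```

Wall-clock timing like this includes interpreter startup, resolution, downloads, and unpacking, which is what actually matters in CI.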


I personally don’t care about the performance:

https://news.ycombinator.com/item?id=44359183

I agree it would be better if it was in Python but pypa did not step up, for decades! On the other hand, it is not powershell or ruby, it is a single deployed executable that works. I find that acceptable if not perfect.


Better performance than C? This is news to me


There are cases where single-threaded Rust and C are faster than each other, though usually only by single-digit percentages. But Rust is so much easier to parallelize than C that it isn't even funny.


According to the very link you provide, the sticking point was a dependency which does not use rust, and the maintainer probably being busy.

I updated a Rust-implemented wheel to 3.13 compat myself, and literally all that required was bumping pyo3 (which added support back in June) and adding the classifier. AFAIK cryptography had no trouble either; IIRC what they had to wait on was a 3.13-compatible cffi.


The PR which enabled 3.13 did have changes to Rust code.

https://github.com/python-pendulum/pendulum/pull/871


Because they did more than just support 3.13:

> I'm sure some of the changes are going too far. We are open to revert them if there's an interest from maintainers to merge this PR :)

Notably, they bumped the bindings (pyo3) for better architecture coverage, and that required some renaming as pyo3 0.23 completed an API migration.


>I am totally against Python tooling being written in a language other than Python.

Cool story bro.

I'm totally against Python tooling being in dismal disarray for the 30 years I've been using the language, and if it takes some Rust projects to improve upon it, I'm all for it.

I'd also rather not have the chicken-and-egg dependency issue with Python tooling written in Python.

>A popular Python package called Pendulum went over 7 months without support for 3.13. I have to imagine this is because nobody in the Python community knew enough Rust to fix it. Had the native portion of Pendulum been written in C I would have fixed it myself.

Somehow the availability and wide knowledge of C didn't make anyone bother writing a datetime management lib in C and making it as popular. It took those Pendulum Rust coders.

And you could of course use pytz or dateutil or some other, but, no, you wanted to use the Rust-Python lib.

Well, when you start the project yourself, you get to decide what language it would be in.


Upvoting for interesting/important/sympathetic perspective, but am very much in disagreement


Offtopic, but thank you. I really wish this way of treating up/downvotes was more widespread. Down should mean it doesn't contribute to the conversation, not that you disagree with their opinion.


> I'm against it creeping into an existing eco-system for no reason.

There is a reason: tools that exist today are awful and unusable if you ever wrote anything other than python.

I'm saying this because the only way I can see someone not realizing it is that they have never seen anything better.

Okay, maybe C and C++ have even worse tooling in some areas, but Python is still near the top of the list for worst tooling.


I love rust but I tend to agree, python tooling should be maintainable by the community without learning a new language.

However rust is a thousand times faster than python.

At the end, if you don't like it don't use it.


I'm wondering why folks aren't moving wholesale from Python to Rust, seems like it would be better for everyone.


Because rust is a lot harder to experiment with and really does not work for interactive or notebook stuff. Python also has a massive ecosystem of existing libraries.

And thus rust is used to either make tools, or build libraries (de novo or out of rust libraries), which plays to both strengths.


It would be a wholesale move from one of the easiest programming languages to start on, to one of the hardest languages to start on.

Most programmers I've met were beginners, and they need something easier to work with until they can juggle harder concepts easily.


I had a situation, admittedly niche, where some git based package dependency wasn't being updated properly (tags vs. commit hashes) and thanks to poetry being written in Python I was able to quickly debug and solve the problem. I think it's more a matter of core functionality (that affects everyone) vs. more esoteric or particular use cases (like dataframe libraries) that make sense to FFI.


Did you even read the issue that you pointed to? It's not even the rust part that was the issue.


Or maybe the community will embrace Rust as it is implemented... There's no reason to think that, because you or the current gen of Python devs are focused on C, the next gen or further will be too.


I understand this sentiment. Part of it was people trying to build up their cv for Rust. On the other hand, some tools/libraries in Python were old. Take pandas for example, it was not good for modern use. We desperately needed something like polars and even that is being outpaced by current trends.


Curious what you see as outpacing polars, hybrid analytical/streaming query engines?


They use HexChat as an example, but do these processes run with the user's configuration? Wouldn't this leak IRC usernames if you forget to change it? ... Or leak cookies if you launch a browser?


Separation of concerns - although Tor goes to great lengths to prevent fingerprinting, Tor and Oniux’s main aim IMHO is to make the source IP untraceable.

Same thing could have been said about using Tor to login to Gmail (if it were not HTTPS).


What do you mean by leak usernames? It would leak that a username uses Tor. It would still leak that all of the usernames connecting to the same IRC host are the same person.

IRC seems pretty dangerous if you want to remain anonymous, considering how many people are logging disconnection times, allowing them to be correlated with other network disruption events.


Tor is anonymizing you primarily from the network. There are many use cases where you do want to be authenticated/known to whoever you are talking to. You just want observers to not know.

In your example of correlation of connection times, it may not be your goal to remain anonymous from the network and its participants, you may be interested in the location-hiding properties, and/or adversarial networks (like local government or corporate networks) and firewalls.


Irssi iirc used to default your username to your system username, so noobs would leak their given name by accident. After seeing that I changed my username in Linux to always be the most common username


I was talking more about you using HexChat with your preferred username "FooBar", but then when on Tor you want to be "SpamEggs". If you launch HexChat through oniux and it reads your config file, you might hit the login button before changing your name from FooBar to SpamEggs.


What is the most common Linux username though? Obviously you don’t want to do your regular work as root. And guest has its own issues.

Is there a “common name”?


Not sure about "most common", but I have some vms that use `user` as the username.


Robert'); DROP TABLE Students;-- Roberts


ubuntu


admin


root


root?


I personally hate how all these platforms like GitHub, GitLab, BitBucket, etc slapped a centralized relational database to manage issues, comments, merge requests, etc next to a distributed de-centralized system like Git.

I especially hate how they've integrated CI/CD into the Git platforms.

I loathe the fact that Microsoft has tied their AI to their Git platform.

I want my CI/CD to be agnostic. I want my AI to be agnostic. I want my issues, MRs, comments, etc to be decentralized and come along for the ride when I clone a repo.


The local first approach to dev tools and ecosystems does seem to be on the way out.

The pressures for this aren’t even explicitly corporate interest anymore; a lot of it is driven by non-software-experts who are kind of forced to participate in software dev (e.g. your friendly data science colleague who used to be, say, in material science or astrophysics), which is completely understandable. But more concerning is the trend of actual software engineers who dislike consoles, terminal programs, and basically don’t believe much in understanding their tools.

You see this all the time with basic stuff from git UIs to kubernetes in IDEs. Productivity isn’t really the issue, although it’s always mentioned as an excuse, there’s just a big appeal to reducing any/every cognitive load no matter what the practical cost is for losing understanding/fluency. To give people the benefit of the doubt though, maybe this pressure is ultimately still corporate though and it started with the call for “full stack” devs, continued with devops/platform engineering etc, where specialists are often actively discouraged. Laziness and a higher tolerance for ignoring details may be a necessary virtue if the market forces everyone to be a generalist.



4k gaming is dumb. I watched a LTT video that came out today where Linus said he primarily uses gaming monitors and doesn't mess with 4k.


No it's not. 2560x1440 has terrible PPI on larger screens. Either way with a 4k monitor you don't technically need to game at 4k as most intensive games offer DLSS anyway.


What matters is the PPD, not the PPI, otherwise it's an unsound comparison.


Too much personal preference with PPD. When I upgraded from a 27" monitor to a 32" one, I didn't push my display through my wall; it sat in the same position.


Not entirely clear on what you mean, but if you refuse to reposition your display or yourself after hopping between diagonal sizes and resolutions, I'd say it's a bit disingenuous to blame or praise either afterwards. Considering you seem to know what PPD is, I think you should be able to appreciate the how and why.


And FSR, which is cross gpu vendor.


Not anymore. FSR4 is AMD only, and only the new RDNA4 GPUs.


I have seen AMD's PR materials for RDNA4, and as far as I can tell, they do not say anywhere anything like that.

People read too much into "designed for RDNA4".


https://cdn.videocardz.com/1/2025/01/AMD-FSR4-9070.jpg

Why would they write that on their marketing slides?


Because it only works on these cards right now.

Further elaborated by their GPU marketing people in interviews. To summarize: "RDNA4 for now" and "we're looking into supporting older...".


Yep. I have both 4k and 1440p monitors and I can’t tell the difference in quality so I always use the latter for better frames. I use the 4k for reading text though, it’s noticeably better.


That's why I also finally went from 1920x1200 to 4k about half a year ago. It was mostly for reading text and programming, not gaming.

I can tell the difference in games if I go looking for it, but in the middle of a tense shootout I honestly don't notice that I have double the DPI.


There are good 4K gaming monitors, but they start at over $1200 and if you don't also have a 4090 tier rig, you won’t be able to get full FPS out of AAA games at 4k.


I still have a 3080 and game at 4K/120Hz. Most AAA games that I try can pull 60-90Hz at ~4K if DLSS is available.


Most numbers people are touting are from "ultra everything" benchmarks; lowering the settings plus DLSS makes 4k perfectly playable.


I've seen analysis showing that DLSS might actually yield a higher quality image than barebones for the same graphics settings owing to the additional data provided by motion vectors. This plus the 2x speedup makes it a no brainer in my book.


Also, ultrawide monitors. They exist and provide more immersion. And the typical resolution is 3440x1440, which is high and at the same time has low PPI (basically a regular 27" 1440p monitor with extra width). Doubling that is way outside modern GPU capabilities.


A coworker who is really into flight sims runs 6 ultrawide curved monitors to get over 180 degrees around his head.

I have to admit with the display wrapping around into peripheral vision, it is very immersive.


Almost no one plays on native 4k anyway. DLSS Quality (no framegen etc) renders at 1440p internally and by all accounts there is no drawback at all, especially above 60fps. Looks great, no noticeable (excluding super sweaty esports titles) lag and 30% more performance. Combined with VRR displays, I would say 4k is perfectly ok for gaming.


Taking anything Linus or LTT says seriously is even dumber....


I watched the same video you're talking about [1], where he's trying the PG27UCDM (new 27" 4K 240Hz OLED "gaming monitor" [2]) and his first impressions are "it's so clean and sharp", then he starts Doom Eternal and after a few seconds he says "It's insane [...] It looks perfect".

[1] https://www.youtube.com/watch?v=iQ404RCyqhk

[2] https://rog.asus.com/monitors/27-to-31-5-inches/rog-swift-ol...


Nonsense. 4k gaming was inevitable as soon as 4k TVs went mainstream.


Today someone's pipeline broke because they were using python:3 from Dockerhub and got an unexpected upgrade ;-)

Specifically, pendulum hasn't released a wheel yet for 3.13 so it tried to build from source but it uses Rust and the Python docker image obviously doesn't have Rust installed.
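One way to guard against that (a sketch; the tag and package come from the story above): pin the base image and refuse source builds, so a missing wheel fails loudly instead of attempting a Rust compile.

```dockerfile
# Pin a specific tag instead of the floating python:3, so a new
# CPython release can't change the build underneath you.
FROM python:3.12-slim

# --only-binary forces wheels; if pendulum has no wheel for this
# Python yet, the build fails immediately rather than trying to
# compile Rust sources in an image with no toolchain.
RUN pip install --only-binary=:all: pendulum
```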


Wow, that's crazy. I tried a 6 digit hash and got a 404, then I tried another 6 digit hash and got "This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository."

Insane


> 1) Fork the repo. 2) Hard-code an API key into an example file. 3) <Do Work> 4) Delete the fork.

... yeah if <Do Work> is push your keys to GitHub.


R2 is only "free" until it isn't. Cloudflare hasn't gotten a lot of good press recently. Not something I'd want to build my business around.


Aside from the casino story (high value target that likely faces tons of attacks, therefore an expensive customer for CF), did something happen with them? I'm not aware of bad press around them in general


R2 egress is free.


Why Rust? Aren't you alienating Python devs from working on it?

I see that UV is bragging about being 10-100x faster than pip. In my experience the time spent in dependency resolution is dwarfed by the time making web requests and downloading packages.

Also, this isn't something that runs every time you run a Python script. It's run once during installation of a Python package.


I actually think that Python's tooling should not be written in Python, because if it is, you end up with at least two versions of Python: one to run the tooling, one to run the project.


I'm not sure of the answer, but one thing Rust has obviously bought them is native binaries for Mac/Windows/Linux. For a project that purports to be about simplicity, it's very important to have an onboarding process that doesn't replicate the problems of the Python ecosystem.


If you are building a production app that uses python in a containerized way, you may find yourself rebuilding the containers (and reinstalling packages) multiple times per day. For us, this was often the slowest part of rebuilds. UV has dramatically sped it up.


Uv has already proven itself by being faster at every step it seems like, except maybe downloading. But notably it includes unpacking and/or copying files from cache into the new virtualenv, which is very fast.


It parallelizes downloads and checking of the packages.

It also doesn't compile .py files to .pyc at install time by default, but that just defers the cost to first import.
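The parallel-download point is easy to illustrate: I/O-bound fetches overlap almost completely when issued concurrently. A toy model (sleeps stand in for network latency; nothing uv-specific):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_fetch(pkg: str) -> str:
    time.sleep(0.1)  # stand-in for one package download
    return pkg

pkgs = ["requests", "numpy", "pandas", "rich"]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    fetched = list(pool.map(fake_fetch, pkgs))
concurrent_s = time.perf_counter() - start  # ~0.1s, not ~0.4s

print(fetched, f"{concurrent_s:.2f}s")
```

Sequentially the four "downloads" would take ~0.4s; overlapped they take roughly the duration of the slowest one.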


It runs every time you build a docker image or build something in your CI


so it takes 3 seconds to run instead of 0.3? Don't get me wrong, that's a huge improvement, but in my opinion not worth switching languages over

Features should be developed and tested locally before any code is pushed to a CI system. Dependency resolution should happen once while the container is being built. Containers themselves shouldn't be installing anything on the fly it should be baked in exactly once per build.


Modern CI can also cache these dependency steps, through the BuildKit based tools (like Buildx/Dagger) and/or the CI itself (like GHA @cache)
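For example, a BuildKit cache mount keeps the package cache across rebuilds even though the layer reruns. A sketch, assuming uv's default cache location and its documented copy-the-binary install pattern:

```dockerfile
# syntax=docker/dockerfile:1
FROM python:3.12-slim
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
COPY pyproject.toml uv.lock ./

# The cache mount persists between builds, so unchanged packages
# come from the local cache instead of being re-downloaded.
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen
```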

