Hacker News: physicsguy's comments

Rust is terrible for pulling in hundreds of dependencies though. Add tokio as a dependency and you'll get well over 100 packages added to your project.

Even setting that aside, tokio no longer pulls in multiple packages; it used to be split into multiple crates, in the same way that KDE written in Rust would be hundreds of packages.

Rust projects tend to be split into many smaller packages, for ease of development, faster compiles through parallelization, proper separation of concerns, and code reuse by others. But the packages are equivalent to a single big package: the people who write them are the same, and they get developed in tandem and published at the same time. Take a look at the dependency tree for ripgrep; the split of different parts of that app allows me to reuse the regex engine without dealing with APIs that only make sense in the context of a CLI app, or pulling in code I won't ever use (which might be hiding an exploit too).

Counting 100 crates of 100 lines each, all by the same authors, as inherently more dangerous than a single 10,000-line crate makes no sense to me.


It's worth noting that Rust packages (crates) are all single compilation units, and every compilation unit is a package. It's the equivalent of complaining that OpenSSL pulls in hundreds of `.c` files.

pin-project-lite is the only base dependency, and it has no dependencies of its own. If you enable the "full" feature, i.e. all optional doodads turned on (which you likely don't need), it's 17: bytes, cfg-if, errno, libc, mio, parking_lot + parking_lot_core + lock_api, pin-project-lite, proc_macro2 + quote + syn + unicode-ident, scopeguard, signal-hook-registry, smallvec, and socket2. Let me know which ones you think are bloat that tokio should reimplement or replace with bindings to a C library, and without the blatant fabrication this time.
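For reference, tokio's dependency footprint is driven by its feature flags, so the usual advice is to opt into only the pieces you use rather than "full". A minimal Cargo.toml sketch (feature names as documented by tokio; the exact set you need depends on your application):

```toml
[dependencies]
# Pull in only the runtime, macros, and networking rather than the
# "full" feature, which enables every optional subsystem.
tokio = { version = "1", features = ["rt-multi-thread", "macros", "net"] }
```

Trimming the feature list is what keeps most of the 17 optional dependencies listed above out of the build.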

We did it in an engineering setting and had very mixed results. Big 800 page machine manuals are hard to contextualise.

I worked in an industry for five years, and I could feasibly build a competitor product that I think would solve a lot of the problems we had, and which it would be difficult to pivot the existing products into. But ultimately, I could have done that before; LLMs just bring the time to build down, and they do nothing for the hard part, which is convincing customers to take a chance on you, sales and marketing, etc. It takes a certain type of person to go and start a business.

Nobody’s talking about starting businesses. The article is specifically about PyPI packages, which don’t require any sales and marketing. And there’s still no noticeable uptick in package creation or updates.

My understanding from reading it was that PyPI package counts are just being used as a proxy variable.

Yes, you are correct. The parent is not following the conversation. They probably didn't even read the article.

They had some great video series too which seem to have stopped. Their War Stories gaming interviews were brilliant.


In my PhD more than a decade ago, I ended up using PNG image file sizes to classify different output states from simulations of a system under different conditions. Because of the compression, homogeneous states led to much smaller file sizes than heterogeneous states. It was super reliable.
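The trick generalizes beyond PNG: any lossless compressor's output size is a rough measure of how disordered a state is. A minimal sketch of the same idea using only Python's stdlib `zlib` on synthetic grids (the grid values and sizes here are illustrative, not from the original simulations):

```python
import random
import zlib

def compressed_size(grid):
    """Serialize a grid of byte-sized ints and return its DEFLATE size."""
    return len(zlib.compress(bytes(grid), 9))

random.seed(0)
n = 256 * 256  # a 256x256 grid, flattened

# Homogeneous state: nearly uniform values compress extremely well.
homogeneous = [1] * n

# Heterogeneous state: disordered values resist compression.
heterogeneous = [random.randrange(256) for _ in range(n)]

size_homo = compressed_size(homogeneous)
size_hetero = compressed_size(heterogeneous)

# The homogeneous state is far smaller once compressed, so a simple
# threshold on compressed size separates the two regimes.
assert size_homo < size_hetero
```

The same classification works with PNG files because PNG's filtering plus DEFLATE stage is doing essentially this compression under the hood.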


I'm not sure why this is against 'frameworks' per se. If we were sure that the code LLMs generate was the best possible, we might as well use assembly, no, since that would give the best performance? But we don't, generally; we still need to validate, verify, and read the code. And in that, there is still some value in using a framework, since the code generated is likely, on the whole, to be shorter and simpler than code not using one. On top of that, because it's simpler, I've found there's less scope for LLMs to go off and do something strange.


> Costs for all features are shared by all users.

To a degree, but most enterprise-focused software has differential pricing. Often that pricing isn't public, so different companies get different quotes.


The other thing is bringing in knowledge about what other customers in the same field want. For business-focused software this can be a boon; customers often can't really envision the solution to their problem. It's like the quote attributed to Henry Ford: "If I had asked people what they wanted, they would have said faster horses."


Until a given company decides they need access control for their contractors that's different from their employees, etc. I've seen it all before with internal, often data-scientist-written applications: they try to scale them out, run into the security nightmare, and lack internal support for developing and taking them forward. Usually these things fizzle out when someone leaves and it stops working.


Bingo; the exact same arguments against regular-coding it in-house apply to vibe-coding it in-house.


I just don't buy it.

Most people who've been in a business SaaS environment know that writing the software is the relatively easy part, outside of very difficult technical domains. The sales cycle, renewals, and solution engineering for businesses make up the majority of the work, and that's going nowhere.

