Maybe it’s because the frameworks themselves are less important than the *human language* used to describe them. To someone who knows what a GenServer is, or has used one before, what does it matter which of the many frameworks they’re using?
It just seems like we reinvent the same things again and again. Maybe that’s because at this level you can’t have a standard and things need to be bespoke.
It could, on the other hand, be an indication that there is a huge amount of wasted effort in revisiting problems that were solved long ago.
I agree, I just think it’s part of the evolutionary process.
We reinvent them because the core concepts are right, but there isn’t enough connective tissue to re-use the implementations directly.
I don’t see it as a bad thing. The concepts persist and gain momentum and with each iteration we get closer to something that can actually be relied on consistently without replication.
There’s wasted effort, but the original must have been missing something that would otherwise have made it the obvious choice. I think it feels wasted mostly because it isn’t yet obvious what the missing pieces were.
I just don’t think this is true. Erlang has existed since the mid-80s and was open-sourced in the late 90s. Joe Armstrong wrote his thesis in 2003. It’s been two decades, and yet reliable distributed systems are still something companies fail at consistently. The latest trend of Kubernetes and web interfaces everywhere is, in my opinion, an example of these ideas never gaining popularity, so we outsourced the problem to infrastructure (then realised it still needed code and created a whole new class of network/developer: DevOps).
Erlang basically posited that process isolation is a fundamental building block for reliable distributed systems in the presence of software errors. I don’t think that’s ever really been challenged, but the solution we’ve ended up with today seems almost ludicrously inefficient in comparison. I just hope lunatic gets traction!
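For what it’s worth, the isolation-plus-supervision idea is easy to sketch even outside the BEAM. Here’s a minimal, hypothetical Python version (the names `worker` and `supervise` are mine, not from Erlang, OTP, or lunatic): each task runs in its own OS process, so a crash can’t corrupt the supervisor’s state, and the supervisor simply restarts the failed process, roughly in the spirit of OTP’s one-for-one strategy.

```python
import multiprocessing as mp

def worker(task, q):
    # Isolated process: an exception here kills only this process,
    # never the supervisor.
    if task == "boom":
        raise RuntimeError("simulated fault")
    q.put(task.upper())

def supervise(tasks, max_restarts=3):
    """Run each task in its own process; restart it on failure."""
    q = mp.Queue()
    results = []
    for task in tasks:
        for _attempt in range(max_restarts):
            p = mp.Process(target=worker, args=(task, q))
            p.start()
            p.join()
            if p.exitcode == 0:
                results.append(q.get())
                break
        else:
            # Gave up after max_restarts, as a real supervisor
            # eventually would.
            results.append(None)
    return results

if __name__ == "__main__":
    print(supervise(["ok", "boom", "done"]))
```

Obviously OS processes are far heavier than Erlang processes, which is exactly the "ludicrously inefficient" point: the same shape, at orders of magnitude more cost per unit of isolation.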