swombat's comments

Bitcoin's success is extremely easy to understand.

Socially:

Some people don't trust governments to handle one of the most powerful collaboration technologies ever invented (money). All financial systems before Bitcoin were government controlled. Some have behaved in a trustworthy way, many have not. And over the longer term they all tend to mess it up eventually.

So these people set out to build an alternative that they believed governments couldn't control.

Technically:

The key advance that made Bitcoin interesting and successful was an algorithm that solved the problem of getting parties who don't trust each other at all to collaborate on maintaining a global ledger, to everyone's benefit, without even having to know about each other.

This is already a feature of money (I don't need to know about you to have indirect financial ties to you) but was not true of the financial system itself until Bitcoin.
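
To make that concrete, here's a toy proof-of-work sketch in Ruby. The block fields and the difficulty constant are made up for illustration (real Bitcoin mining hashes block headers against a dynamically adjusted target), but the trust property is the same: the work is expensive to produce and trivial for anyone to verify, with no knowledge of the miner.

    require "digest"

    DIFFICULTY = 4 # leading zero hex digits required (illustrative only)

    # Grind nonces until the hash of (prev_hash, payload, nonce) meets
    # the difficulty target.
    def mine(prev_hash, payload)
      nonce = 0
      nonce += 1 until valid?(prev_hash, payload, nonce)
      nonce
    end

    def valid?(prev_hash, payload, nonce)
      Digest::SHA256.hexdigest("#{prev_hash}|#{payload}|#{nonce}")
                    .start_with?("0" * DIFFICULTY)
    end

    nonce = mine("genesis", "alice pays bob 1")
    puts valid?("genesis", "alice pays bob 1", nonce) # => true, for anyone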


> All financial systems before Bitcoin were government controlled

Company scrip, community currency, hawala have all existed for centuries.

Bitcoin is also government controlled. It lacks the anonymity required to protect participants, making it trivial for nation states to influence. And it does nothing to prevent the centralisation of capital that causes so much manipulation in traditional currency systems.

> And over the longer term they all tend to mess it up eventually.

Over the long term the probability of failure of all systems is 1.


> Company scrip, community currency, hawala have all existed for centuries.

All of these have the problem of centralized and permissioned issuance, where one entity can arbitrarily inflate the supply without the knowledge or consent of others.

> Bitcoin is also government controlled

In what way?

> It lacks the anonymity required to protect participants

This is false and does not make it government controlled. I’ll concede that there are many ways for one to lose privacy when using Bitcoin though.

> And it does nothing to prevent the centralisation of capital

The ‘centralization of capital’ isn’t an issue Bitcoin aims to solve. One of the big problems Bitcoin solves is the unjust accumulation of capital via arbitrary issuance (IORB, RRP, loans via newly created bank deposits, etc.)
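
For contrast, Bitcoin's issuance schedule is fixed and auditable by anyone. A quick Ruby sketch of the well-known schedule (50 BTC initial subsidy, halved by integer shift every 210,000 blocks):

    subsidy = 50 * 100_000_000 # initial block subsidy, in satoshis
    total = 0
    until subsidy.zero?
      total += subsidy * 210_000 # blocks per halving period
      subsidy >>= 1              # the halving: integer division by 2
    end
    puts total / 100_000_000.0   # => 20999999.9769, just under 21M BTC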


I wonder what it says about me that the only counterexample that came to me is the use of Nuka Cola bottle caps in the Fallout games. But that only works in a post-apocalyptic setting, when there's no longer any manufacturing capability.


That explains why something like Bitcoin might emerge, but not specifically Bitcoin. Presumably first mover advantage, etc.


Feels like a misleading headline. The author created another templating language as an alternative to ERB, found that it was slower than ERB, then optimised it until it was roughly as fast as ERB given a relatively simple template.

The author then appears to draw the wrong conclusion:

> What I find most interesting about the changes I’ve made to code generation in P2, is that the currently compiled code is more than twice as fast as it was when P2 first came out, which just goes to show that in fact Ruby is not slow, it is actually quite fast, you just need to know how to write fast code! (And I guess this is true for any programming language.)

I love Ruby, but it is still a slow language on most benchmarks. That's ok. For most webapps, the bottleneck is not execution-time performance, it's the speed and joy of writing the code. Functionality that never got built because it was too annoying to build is infinitely slow. But there's no point in pretending Ruby, compared to, say, Rust, isn't a slow-as-molasses execution environment. It is. It's ok. It's optimised for developer happiness, not speed.

And yes, even so, you can write slow Ruby code and fast Ruby code. Again, which one makes sense is contextual. But it doesn't make the point that "Ruby isn't slow."


Hi, author here. Taking your argument to its logical conclusion we can say that it doesn't matter if your Ruby code is slower or faster because it's Ruby, we know it's "slow-as-molasses", we only care about developer happiness, and anyways we're I/O-bound, so it doesn't really matter how our code performs...

But in my experience it does. I've built platforms in Ruby that handle north of 1K reqs/sec with bursts of 10K reqs/sec on moderate hardware, without needing to set up a whole cluster of machines that crunch on poorly-performing code.

From my experience, getting the average execution time per render from, say, 0.1ms to 0.01ms, and especially reducing allocations and GC pressure, has a big effect on 99th percentile latency, and consequently on the cost of compute.
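
If anyone wants to measure that, here's a rough sketch of counting allocations around a hot path on MRI; the `render` lambda below is a made-up stand-in for whatever you're tuning, not a real API.

    # Count objects allocated by a block, using MRI's GC stats.
    def allocations_for
      GC.disable # keep a GC run from skewing the numbers
      before = GC.stat(:total_allocated_objects)
      yield
      GC.stat(:total_allocated_objects) - before
    ensure
      GC.enable
    end

    render = ->(name) { "Hello, #{name}!" } # stand-in for a template render
    puts allocations_for { render.call("world") }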

Saying that because we use Ruby we don't care whether it's slow is, in a way, dismissing it as a viable platform for writing reliable software (because performance is part of reliability).

To me, you can use Ruby to create systems that have very good performance characteristics, and still benefit from developer happiness. The two are not contradictory.


> Taking your argument to its logical conclusion we can say that it doesn't matter if your Ruby code is slower or faster because it's Ruby, we know it's "slow-as-molasses", we only care about developer happiness, and anyways we're I/O-bound, so it doesn't really matter how our code performs...

Not OP, but to a point I think this is pretty much true...

We currently have decent performance so it works out well for most use cases, but if Ruby were slower, we could probably cover that issue with infra, caching, or other means, as we already do in many cases.

It would be a pain point, but in comparison, increasing developer happiness or the whole product dev experience is IMHO a lot harder. Performance would need to be abysmally bad to change that balance.


I’m currently looking at some slow python scripts and libraries that take about 30% of the total build time to generate 20 linker scripts for a project that builds 1,300 C files. Every dev, every build, every CI run is waiting on slow python. So the devs that wrote the tool are happy, but everyone else is slowly dying waiting around for this thing to finish. Relevant [0]

0 - https://www.folklore.org/Saving_Lives.html


Where's the second part of the story, where someone else profiled and discovered 19 of the linker scripts were generated really fast, and that re-working generation of the slowest script only took 15 minutes?


I have been profiling and it’s really just python being slow. The first ⅓ of the run, on 12 cores, is all python. Fortunately these are mostly independent, so there is currently no blocking, but there eventually will be.


Thanks. I don't have any expertise to share. I'm just curious. Might PyPy be faster?


> Every dev, every build, every CI run is waiting on slow python.

Parallelize the build, buy more resources for the CI. It might cost more but it will be "saving lives" after all, right?

The question is usually whether those scripts would have existed if it wasn't for Python. I assume if it was trivial to rewrite them you'd do it in 2 hours and go on with your life.


Besides the python scripts, everything is parallelized and is CPU bound on as many cores as it can be given. Because of licensing, throwing more CI at it isn't an option. This is an open source project, so there's not really money to buy moar bigger.

The tools possibly wouldn’t exist, but there are options now that provide better ergonomics and are not slow.


I think that's the position most "slow" scripts are in: they could have been written faster in the first place, but they weren't intended to be kept for so long and/or that wasn't a priority at the time. And now that people are looking at them again they could be fixed, but there's usually still not enough merit to do so.

I assume something will be done at some point, but IMHO it's one of those very nice problems to have, as there is a working version that will only be replaced by something better, and only if it's worth it.


Yep, I agree with this.

Jean Boussier wrote this excellent examination[1] of CPU-bound Rails applications and why making use of multiple processes is probably a good idea.

Even if you're not CPU bound, it's still daft to leave performance on the table when you don't need to.

For the most part, if something is a bit slower than it needs to be, it still makes more sense to take the obvious bottlenecks out before you rewrite the whole system in another language. Especially with YJIT and the forthcoming ZJIT available.

1. https://byroot.github.io/ruby/performance/2025/01/23/the-myt...


Love all the work Jean Boussier does for the ecosystem.

I would add to this commentary that there are a number of things you can do in Rails to speed it up. For instance, ActionController::Metal
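
A minimal sketch of what that can look like (the controller and route are made up): ActionController::Metal skips most of ActionController::Base's feature modules, so each request does far less work, and you opt back in only to what you need.

    class PingController < ActionController::Metal
      # No render stack needed for a raw body; include modules like
      # ActionController::Rendering only if you actually want them.
      def show
        self.response_body = "pong"
      end
    end

    # In config/routes.rb:
    #   get "/ping", to: PingController.action(:show)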


Ruby is slow, but it is faster than Python last I checked, with like-for-like code.

Python gives you other interesting ways of going fast (a lot of high-performance plugins for numerics, Cython, and so on), while Ruby is a higher-level, more expressive language IMO, so you have more ways to shoot yourself in the foot more powerfully.


Depends how you use it; just last week I’ve hit 40 nanoseconds unpacking an 8 megabyte msgpack array and accessing one of its values in a hash.

As long as you only use ruby as glue code for c(++) extensions it’s pretty fast.
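
Something like this pattern, assuming the msgpack gem (the serialization work happens inside its C extension; Ruby just hands bytes over and indexes into the result):

    require "msgpack" # gem install msgpack

    payload = { "users" => (1..100_000).map { |i| { "id" => i } } }
    packed  = MessagePack.pack(payload)   # C does the heavy lifting
    data    = MessagePack.unpack(packed)  # ...and here too
    puts data["users"].first["id"]        # => 1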


AFAIK with the latest JIT in some contexts pure Ruby can be faster than using C libraries, just because the VM can be better optimized and there is no overhead in moving data between the two realms.

I don't recall exactly where I read it, but I think it was a while ago when they announced one of the newest JIT engines.


I recall a similar statement, and I think it was from the YJIT team. They suggested that more and more people write pure Ruby rather than using C extensions, because the YJIT compiler cannot see into C code; it's a black box to it, so it cannot optimize it further. Which means that in practical examples, YJIT has been able to optimize pure Ruby code to the extent that in some cases it has not only matched the C extension but surpassed it.

More Ruby code means more room for optimizations that the team can identify and add to the compiler
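
If you want to poke at this yourself, here's a sketch for Ruby 3.3+ using the benchmark-ips gem (the workload is made up; the point is just that YJIT can see all the way through a pure-Ruby hot loop):

    require "benchmark/ips"

    # Enable YJIT at runtime (the enable method exists from Ruby 3.3).
    RubyVM::YJIT.enable if defined?(RubyVM::YJIT) && RubyVM::YJIT.respond_to?(:enable)

    # A pure-Ruby hot loop; nothing here is opaque to the JIT.
    def sum_of_squares(n)
      total = 0
      i = 0
      while i < n
        total += i * i
        i += 1
      end
      total
    end

    Benchmark.ips do |x|
      x.report("pure Ruby") { sum_of_squares(10_000) }
    end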


If you want C extensions to get (de)optimized at runtime based on their usage patterns, there is always Ruby on GraalVM from Oracle.


You're thinking of TruffleRuby, right? Yeah, I have it bookmarked actually; its performance is quite impressive.



So you don't actually know?


> As long as you only use ruby as glue code for c(++) extensions it’s pretty fast.

Another way of saying that is "as long as you don't use it it won't slow you down".


> there's no point in pretending Ruby, compared to, say, Rust

Just the thought of comparing Ruby's execution speed with Rust's is pointless. They are completely different languages and have their own use cases; if you care about performance, Ruby is not it. I don't think the author intended to dispute that either.


I had a similar conclusion about C++. C++ takes forever to compile, but C++ is truly insanely fast; it's just that the compilation process is insanely inefficient. Look at Go or even D (IIRC with parallel compilation). It's a night and day difference. C++ is not slow, but its compilers sure as heck are.

Edit: Another honorable mention, look at Delphi in its prime, millions of lines of code, compiles in under 5 minutes.


Seconding Omarchy.

Not only am I progressively migrating from my Mac onto an Omarchy Linux setup... but I've even gone and beaten my Mac into behaving more like Omarchy (with Aerospace as the tiling wm) in the meantime...


Not even that long, but I agree on the "underwhelming"...

"Oh I found some niche issue that bothered me and wrote some code to fix it."

-> HN Front Page


Virtue trifecta: Rust, testing, and "just"


It's a great title of which we all need reminding.


If it’s that simple where are your front page blog posts for fixing niche issues?


Granted, he did. At the same time, Asimov was well known to be a groper, and even wrote a satire book called "The Sensuous Dirty Old Man", which would probably have landed better as satire had he not been fairly well known in scifi circles to be, in fact, a dirty old man.

There were some decent scifi authors at the time - not least Ursula K. Le Guin.


Ok, but "had some authors who wouldn't pass a 2025 purity test" is moving the goalposts quite a lot from "never published anything that wasn't by a White man, about a White man".

Le Guin, while a great author, was 12 when the first Susan Calvin story was published, and wouldn't be published herself until 18 years later. So she wasn't exactly being overlooked at the time.

And if you really insist on some identity politics in your science fiction, you'd do well to remember that in 1941 Asimov wasn't a White man. He was Jewish, which, while not as bad as being Black, had some very real consequences in 1940s America, not least being subject to a university admissions quota.


No, only if you have some kind of terrestrial TV set up or if you watch Live TV online via BBC's iPlayer or one of the major channels' live TV players.


It's only for live TV? I had the impression you needed a license to use iPlayer at all.


Yes, it’s for live TV and/or any BBC content


Yep, and the rules are absolutely mad. You can watch at home or on the go, provided you're on battery power. As soon as you plug your iPad or iPhone or whatever you're watching iPlayer on in somewhere else, the address the electricity is being supplied to needs a TV licence.

I get that these warts appear over time, and it's probably not intentionally the case that whether your iPad is plugged in or not can determine whether your licence covers you; it's more likely meant to stop people creating fixed 'TV viewing installations' in other people's houses and claiming their licence should cover them. But still...


I've been working with RoR since back in 2008 (Rails 2.1 yay!).

I'm still working with RoR.

It's still an incredibly quick, powerful, flexible, versatile framework. I'm able to build pretty large and complex apps all by myself, quite quickly, without fuss.

I'm looking forward to the deployment improvements since that was one of the shortcomings that was still around. Kamal v1 didn't really do it for me (currently using Dokku instead). Maybe Kamal 2 is it.


I was already pretty happy with Kamal 1, but Kamal 2 will probably do it for way more people. The most requested feature was running multiple apps, so I wrote a post on what that looks like if anyone is curious[0].

[0] https://nts.strzibny.name/multiple-apps-single-server-kamal-...


I look forward to the updated book too (bought the Kamal 1 version!) :-) Thanks for writing it.


With love, please consider - the "shame" you're describing is really something else in a mask.

Perhaps... a longing? Maybe this stranger has helped you find the place where you do truly long for life.

Let the feeling be. Don't label it shame. Don't label it longing. Just let it be. Give it space. Cry if you feel like it. Laugh if you feel like it. Just feel it.

And when you're ready to speak about this with others, there will be many, many willing to be there for you. You are loved.


Another perspective: shame can be good. Feel it. Shame for who you are can light a fire in you, can propel you into transformation. Shame for one's past self is normal, if one has undergone any growth, and in time one may forgive oneself. But not now, not when you know yourself and you see all the ways you are lacking. Not when you are so wholly disappointed in your life that you want to end it. _Longing_ for a different life will not result in change. Shame, and deeply ruminating on it, can. In time you will transform and can forgive the past self you are ashamed of, but not now in your time of desperate need.


I think it’s worth drawing a distinction between guilt, which can be positive, and shame, which almost never is. Guilt is feeling bad because you know you’ve done wrong. Shame is feeling bad because other people know you’ve done wrong.


I still feel shame can be noble. To try to live up to the example of others and feel ashamed that you are not anywhere near their greatness. Not guilty, because you have not done wrong, but shame, because you are not enough compared to another.


Identification & Authorisation are a better pairing here than Authentication and Authorisation.

This way, if someone says "Oh yeah we have an auth module on this site" you don't need to immediately disambiguate the statement.

But then "auth" itself is ambiguous. So it might make sense to get rid of the lot. "Identification" is a good word for the first. Perhaps "Permissions" for the second?


authn -> ident

authz -> perm

So much clearer.


Similarly, AlphaGo and Stockfish are only able to simulate reasoning their way through a game of Go or a game of Chess.

That simulated reasoning is enough to annihilate any human player they're faced with.

As Dijkstra famously said, "whether Machines Can Think... is about as relevant as the question of whether Submarines Can Swim".

Submarines don't swim, cars don't walk or gallop, cameras don't paint or draw... So what?

Once AI can simulate reasoning better than we can do the genuine thing, the question really becomes utterly irrelevant to the likely outcome.

