falcor84's comments

> will be given to your family once you die, sounds a bit macabre.

To me it sounds more than a bit macabre - depending on the familial relations, it would seem like a motive for someone to commit suicide in order to provide for their children, or for the children to murder them. I can already imagine the memoirs being adapted into Netflix shows.


Many companies provide a life insurance benefit equal to 50%-150% of annual salary.

If your sport has any mortality or long-term risk (concussions, cardiac events), then this could be seen as a nice extra insurance policy.


That's an interesting formulation. I'd actually be quite worried about a Manna-like world (as in Marshall Brain's story), where we have AGI and most humans don't have any economic value except as its "remote hands".

There's a bit of a circular argument here - even if we humans always assign intrinsic value to ourselves and our kin, I don't see a clear argument that human capabilities will have external value to the economy at large.

"The economy" is entirely driven by human needs.

If you "unwind" all the complexities in modern supply chains, there are always human people paying for something they want at the leaf nodes.

Take the food and clothing industries as obvious examples. In some AI singularity scenario where all humans are unemployed and dirt poor, does all the food and clothing produced by the automated factories just end up in big piles because we naked and starving people can't afford to buy them?


There's nothing definitional about the economy being driven by human need. In a future scenario where there are superintelligent AIs, there's no reason why they wouldn't run their own economy for their own needs, collecting and processing materials to service each other's goals - space exploration, for example.

That's an interesting argument. I don't like it, but I can't prove it wrong, so maybe we're approaching a new era where this is true.

But we're clearly not there now, so I stand by my prediction for the medium-term future!


"The economy" is humans spending money on stuff and services. So if humands always assign intrinsic value to ourselves and our kin...

For economic purposes, "the economy" also includes corporations and governments.

Corporations and governments have counted amongst their property entities to which they did not grant equal rights - sometimes entities they did not even consider to be people. Humans have in the past been treated much as livestock and guide dogs still are.


This will break down when >30% of people are unemployed

It's another small source of friction. I don't know if biometrics are the solution, but I do find, for example, that I'm much more comfortable buying on a website I've used before and that already has my card details, rather than giving them to a new website.

Some are harder drugs than others. I'd say that HN is caffeine, Discord and WhatsApp are alcohol, and Instagram is meth.

I would peg TikTok as the direct IG competitor, imho. That doesn't match its roots, but the way I see people use IG is the same way I see people use TikTok.

That definitely happens, but I have had the displeasure of working at companies that were so enamored with the solution they had that they couldn't be convinced to look again at the problem and see how it had changed since they originally solved it. As with most anything, the best approach is somewhere in the middle, combining a love for the problem with a drive to repeatedly solve it. And one of the best tools for that seems to be dog-fooding, where the people in the company really want to use the product themselves.

This is something else. I want an LLM to continuously interact with a persistent authored world, not just have it make things up as it goes.
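
A minimal sketch of what I mean (Python; query_llm is a hypothetical stand-in for whatever local or hosted model you'd use) - the authored world lives in a persistent store that the model reads from and writes back to, instead of improvising state:

    import json

    def query_llm(prompt: str) -> str:
        """Hypothetical stand-in for any chat-completion call, local or hosted."""
        raise NotImplementedError

    def load_world(path: str) -> dict:
        with open(path) as f:
            return json.load(f)

    def save_world(path: str, world: dict) -> None:
        with open(path, "w") as f:
            json.dump(world, f, indent=2)

    def narrate(world: dict, player_action: str) -> str:
        # The authored world file is the source of truth; the LLM only
        # narrates against it instead of inventing state as it goes.
        prompt = (
            "You are the narrator of this authored world:\n"
            f"{json.dumps(world)}\n\n"
            f"The player does: {player_action}\n"
            "Describe the outcome, staying consistent with the world state."
        )
        return query_llm(prompt)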

This is awesome!

And I want to run it locally. Publish it on steam with a configurable LLM and I'll buy it.


You're assuming a world where humans are still needed to read the papers. I'm more worried about a future world where AIs do all of the work of progressing science and humans just become bystanders.

Why are you worried about that world? Is it because you expect science to progress too fast, or too slow?

Too fast. It's already coding too fast for us to follow, and from what I hear, it's doing incredible work in drug discovery. I don't see any barrier to it getting faster and faster, and with proper testing and tooling, getting more and more reliable, until the role that humans play in scientific advancement becomes at best akin to that of managers of sports teams.

I don't get this argument. Our nervous system is also heterogeneous; why wouldn't AGI be based on an "executive functions" AI that manages per-function AIs?
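
A rough sketch of what that could look like (Python; query_llm is again a hypothetical stand-in) - an executive model whose only job is routing tasks to per-function specialists:

    def query_llm(prompt: str) -> str:
        """Hypothetical stand-in for any chat-completion call."""
        raise NotImplementedError

    # Per-function "brains": each is just a model behind a system prompt.
    SPECIALISTS = {
        "math": "You are a careful mathematician. Solve step by step.",
        "code": "You are a senior programmer. Write and explain code.",
        "recall": "You answer factual questions as accurately as you can.",
    }

    def executive(task: str) -> str:
        # The executive model only decides which specialist handles the
        # task, analogous to executive functions delegating in the brain.
        routing_prompt = (
            f"Task: {task}\n"
            f"Pick exactly one specialist from {sorted(SPECIALISTS)} "
            "and reply with its name only."
        )
        choice = query_llm(routing_prompt).strip().lower()
        system = SPECIALISTS.get(choice, SPECIALISTS["recall"])
        return query_llm(f"{system}\n\nTask: {task}")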
