Hacker News

That is an issue, but there is a huge difference between having to slow the brain back down in certain situations and being unable to speed it up at all.

I don't expect to play an entire game on fast forward, after all.

Edit, on the breakdown point: I just assumed that whatever input was given would be sped up too. That part of the project seems far less complicated than the brain simulation itself.



My knowledge of brains is limited, but I'd think the issue remains the same even if you cut off all external inputs.

Basics like memory decay are also tied to the system clock. So if you ran your brain at 1000x speed, it would probably forget everything almost immediately.
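A toy sketch of that clock coupling (my own numbers, nothing from actual neuroscience): if the decay rate is fixed per clock tick, then multiplying the tick rate by 1000 divides a memory trace's real-world lifetime by 1000.

```python
# Toy illustration: a decay process whose rate is hardwired per clock tick.
# Speed the clock up 1000x and the trace's real-world lifetime shrinks 1000x.
def real_lifetime_s(lifetime_ticks: float, ticks_per_s: float) -> float:
    # ticks the trace survives, divided by how fast ticks now elapse
    return lifetime_ticks / ticks_per_s

BASE_RATE = 100.0            # assumed ticks per real second at 1x
LIFETIME_TICKS = 360_000.0   # assumed: trace survives this many ticks (~1 h at 1x)

print(real_lifetime_s(LIFETIME_TICKS, BASE_RATE))          # → 3600.0 (one real hour at 1x)
print(real_lifetime_s(LIFETIME_TICKS, BASE_RATE * 1000))   # → 3.6 (seconds at 1000x)
```

The numbers are invented, but the shape of the problem isn't: any process calibrated in ticks rather than in "amount of experience" falls out of sync the moment you change the tick rate.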

And if you make a "simple" patch that prevents it from ever forgetting anything, it would be overwhelmed, because it is only wired to deal with a certain number of memories at a time.

In terms of the DOS-game analogy: we may be able to patch a game that originally ran in 256 KB of RAM to run in 2 GB and actually fill that up (because we disabled the garbage collector). But the game probably uses algorithms that break down when faced with such a large dataset.
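A sketch of that kind of breakdown (my own example, not anything from a real game): a duplicate check that scans the whole list on every insert is invisible at small sizes, but its total work grows quadratically, so a 1000x larger dataset costs roughly 1,000,000x more comparisons.

```python
# Toy illustration of an algorithm that is fine at 256 KB scale but
# breaks down at 2 GB scale: insertion with a full linear scan.
def total_comparisons(n_items: int) -> int:
    # inserting item i compares it against the i items already stored,
    # so the total is 0 + 1 + ... + (n-1) = n*(n-1)/2
    return n_items * (n_items - 1) // 2

print(total_comparisons(1_000))      # → 499500 (harmless)
print(total_comparisons(1_000_000))  # → 499999500000 (hopeless)
```

Fixing it means replacing the algorithm, not just raising the limit, which is exactly the "actually understand the game in detail" problem.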

At this point we're back to having to actually understand the game (or brain) in detail, in order to make the changes required to run at higher capacity.


Actually having a higher capacity will be tricky, yes. But at least there won't be cell decay in the scientists working 4000-hour weeks to figure it out.



