
> even if you made an alphabet (strings whose characters had decimal places or something) that was bigger, then it would still be kinda boring cuz the hyper hyper turing machine would still be a lookup table (though maybe even the machine in my example could still just be a lookup table for an uncountably large language on account of having uncountably many tapes/heads).

That's pretty much what I meant. Calling it "boring" wasn't meant as a jab; it's just that there is little effort left in understanding the limitations of such a machine once the whole language can be encoded into its look-up table. Such an extension also makes us lose the ability to simulate other extended machines in general. Extensions to Turing machines are interesting because of how they may alter the limits of computability (in a completely hypothetical setup) and of complexity, and I had some fun writing the response; I called that specific machine model boring only because it ended up being too powerful to be interesting from a computability-theory perspective.



i had some more crackpot thoughts if you don't mind hearing them lol

the set of all finite strings (over a finite alphabet) is countably infinite, like the natural numbers
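
(for concreteness, a tiny python sketch of the usual enumeration argument: list strings by length, then alphabetically, so every finite string gets exactly one natural-number index; the two-letter alphabet is just for illustration)

    from itertools import count, islice, product

    ALPHABET = "ab"  # any finite alphabet works the same way

    def all_strings():
        # length 0, then length 1, then length 2, ...
        # every finite string shows up at exactly one index,
        # which is the bijection with the natural numbers
        for length in count(0):
            for chars in product(ALPHABET, repeat=length):
                yield "".join(chars)

    for index, s in enumerate(islice(all_strings(), 8)):
        print(index, repr(s))   # 0 '', 1 'a', 2 'b', 3 'aa', ...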

however, human language as actually spoken carries recursive implicit meanings and subtext, so maybe, even though it's just strings, making all of that implicit context explicit would give you something uncountably infinite, much like the real numbers, even if in practice the meaning a human can grok from any given sentence is bounded, like an ieee floating point number.

but i find it somewhat interesting that whereas every ieee float carries a fixed amount of information (a set number of significant digits), the way we encode implicit meanings and tones into language is uneven: it's not character by character (though it can be, if you add an accent or tone mark to a character, even in non-tonal languages).
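
(the fixed-capacity part is easy to poke at in python, whose floats are ieee doubles: 53 significand bits, roughly 15-16 significant decimal digits, regardless of the value; just a quick demo)

    import sys

    # IEEE 754 double precision: 53 significand bits ~= 15-16 decimal digits
    print(sys.float_info.mant_dig)   # 53
    print(sys.float_info.dig)        # 15

    # detail beyond that capacity is silently dropped
    print(0.1 + 0.2)                 # 0.30000000000000004
    print(1e16 + 1.0 == 1e16)        # True: the +1 doesn't fit anymore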

like, what if you could design a number system more like human language, where the decimal places somehow only showed up under certain conditions (maybe you could argue complex numbers are kinda like that, since the imaginary or real parts appear and reappear under certain operations)
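
(that complex-number intuition at least checks out mechanically with python's built-in complex type: purely imaginary inputs can collapse to a real result, and a purely real input can sprout an imaginary part)

    import cmath

    print(3j * 2j)         # (-6+0j): two imaginary numbers, real result
    print(cmath.sqrt(-4))  # 2j: a real number, imaginary result
    print((-4) ** 0.5)     # ~2j as well (python 3 gives a complex answer)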

could you design a number system or an algebra in such a way that it was up for interpretation lol (or if not why not)

one thing is that you can interpret numbers in terms of geometry, but the interpretation is one-to-one, and thus boring compared to the interpretability of strings.

also maybe lambda calculus is somewhat interpretable in this way, and it can encode numbers and algebraic rules
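
(the classic church encoding is a nice concrete version of that: numbers and arithmetic as nothing but lambdas; here's a minimal python sketch of the standard construction)

    # church numerals: the number n is "apply f, n times"
    zero  = lambda f: lambda x: x
    succ  = lambda n: lambda f: lambda x: f(n(f)(x))
    plus  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
    times = lambda m: lambda n: lambda f: m(n(f))

    def to_int(n):
        # decode a church numeral back into an ordinary int
        return n(lambda k: k + 1)(0)

    two = succ(succ(zero))
    print(to_int(plus(two)(two)))    # 4
    print(to_int(times(two)(two)))   # 4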

maybe the logical conclusion of this train of thought is the IO monad XD



