It's 2025 and every useful conversation with an LLM ends in context exhaustion. Some argue this is a feature, not a bug, or that the context lengths we already have are enough. I think they lack imagination. True general intelligence lies on the other side of infinite context length. Memory makes computation universal, remember? http://thinks.lol/2025/01/memory-makes-computation-universal...