Who knows, maybe one day I'll look into fixing it. Until then, I guess I'll just keep using my bash script with Raycast that removes newlines and then pastes.
Nobody ever talked about beginnings. The question is, even if the universe always was,
why does it exist? It wouldn't be impossible for there to be nothing and never have been anything.
It's an abstraction. If there was no beginning there was never the possibility of nothing. Just because you can conceive of a question doesn't mean it is valid.
> To be able to predict accurately sentences that make sense, GPT-4 must have an internal way of representing concepts, such as "objects", "time", "family" and everything else under the sun.
That's more or less exactly what these models do by design; otherwise the predictors/estimators for the next token in a sequence would fail spectacularly. These models aren't just plucking random tokens out of a bag and ordering them based on some brute-force large-scale memory lookup. There is an internal representation of the tokens in a more meaningful sense. That sense, however, is limited to the statistical/mathematical framework upon which the models are built. It's a huge leap (in my opinion a completely unjustified exercise in wishful thinking) to call it "reasoning."
How can any system use a word such as "time" in a way that makes sense without representing the word? It definitely has at least one representation: the binary ASCII code 01110100011010010110110101100101.
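For what it's worth, that bit string checks out. A quick sketch in Python that rebuilds it from the 8-bit ASCII codes of each character (string and format chosen just for this illustration):

```python
# Concatenate the 8-bit ASCII code of each character in "time".
bits = "".join(f"{ord(c):08b}" for c in "time")
print(bits)  # → 01110100011010010110110101100101
```

That's 't' (01110100), 'i' (01101001), 'm' (01101101), 'e' (01100101) back to back, matching the string quoted above.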
Fun fact: the unusual way of saying eighty in French ("four twenties") actually came about through local contact with Celtic languages! In linguistic terms this is called "substratum influence".
Telegram has this natively already