A vector space search engine, vector database, and an anything key/value store. It powers efficient string processing, vector operations, and custom storage primitives designed for speed and simplicity. It can produce large language models out of strings and large anything models out of byte arrays.
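As a rough illustration of the combination described above (not the project's actual API; the `VectorStore` class, its methods, and the cosine-similarity scoring are all assumptions for the sketch), a vector store layered on a key/value map might look like:

```python
import math

class VectorStore:
    """Hypothetical minimal in-memory key/value store with vector similarity search."""

    def __init__(self):
        self.data = {}  # key -> vector (list of floats)

    def put(self, key, vector):
        self.data[key] = vector

    def get(self, key):
        return self.data.get(key)

    @staticmethod
    def _cosine(a, b):
        # Cosine similarity: dot product divided by the product of magnitudes.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def nearest(self, query, k=1):
        """Return the k keys whose vectors are most similar to `query`."""
        scored = sorted(self.data.items(),
                        key=lambda kv: self._cosine(query, kv[1]),
                        reverse=True)
        return [key for key, _ in scored[:k]]

store = VectorStore()
store.put("a", [1.0, 0.0])
store.put("b", [0.0, 1.0])
store.put("c", [0.7, 0.7])
print(store.nearest([1.0, 0.1], k=2))  # closest vectors to the query, best first
```

A real engine would replace the linear scan in `nearest` with an approximate index, but the key/value-plus-similarity shape is the same.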
We're at peak ape technology, yet at this point in time, for some reason, we turn to one specific ape: not a smart one, but an ape who feels he should rule the world, who stands on the doorstep of world domination, who feels entitled to that much power, and we ask him to do "the right thing"?
America, don't be the next force the rest of the world needs to fight. Wake the fuck up.
Most commonly it is pickled herring. I don't think rotten herring is a thing in any broader circles.
Homemade bäsk is usually much better than factory-made Bäska Droppar, if you enjoy the taste of wormwood more than just being slapped in the face with artificial bitterness and sugar.
Of course, if you hate the taste of fish, pickled things and spirits in general, you are unlikely to enjoy any of it.
Also not quite the same as rotten! And I would hazard a guess that the consumption ratio of inlagd sill (pickled herring) to surströmming is at least 1000:1, maybe 100000:1.
Aquavit equally disgusting? I guess I should try malört :) Aquavit with beer chaser and https://en.wikipedia.org/wiki/Pinnekj%C3%B8tt is a great Christmas dish (but I would die if I didn't keep it to Christmas)
It seems like a useful adaptation of the term to a new usage, but I can understand if your objection is that it promotes anthropomorphizing these types of models. What do you think we should call this kind of output, instead of hallucination?
Maybe another way of looking at it is: the paper is attempting to explain what LLMs are actually doing to people who have already anthropomorphised them.
Sometimes, to lead people out of a wrong belief or worldview, you have to meet them where they currently are first.
> In this paper, we argue against the view that when ChatGPT and the like produce false claims they are lying or even hallucinating, and in favour of the position that the activity they are engaged in is bullshitting, in the Frankfurtian sense (Frankfurt, 2002, 2005). Because these programs cannot themselves be concerned with truth, and because they are designed to produce text that looks truth-apt without any actual concern for truth, it seems appropriate to call their outputs bullshit.
> We think that this is worth paying attention to. Descriptions of new technology, including metaphorical ones, guide policymakers’ and the public’s understanding of new technology; they also inform applications of the new technology. They tell us what the technology is for and what it can be expected to do. Currently, false statements by ChatGPT and other large language models are described as “hallucinations”, which give policymakers and the public the idea that these systems are misrepresenting the world, and describing what they “see”. We argue that this is an inapt metaphor which will misinform the public, policymakers, and other interested parties.
The criticism that people shouldn't anthropomorphize AI models that are deliberately and specifically replicating human behavior is already so tired. I think we need to accept that human traits will no longer be unique to humans (if they ever were, if you expand the analysis to non-human species), and that attributing these emergent traits to non-humans is justified.
"Hallucination" may not be the optimal metaphor for LLM falsehoods, but some humans absolutely regularly spout bullshit in the same way that LLMs do - the same sort of inaccurate responses generated from the same loose past associations.
Out of spite, and because I could, I deleted all the text on the noyaml site and then clicked "save". Sorry if you were too slow to read it. You might find it in the Wayback Machine?
Well said about how names get deprioritized when we don't realize they're important. Here's a counterpoint: I'm a movie geek, yet when I reach for the name of someone in the cast, I almost always end up describing, y'know, that guy who played together with that other guy in that movie, y'know, the one with the weird storyline? Him! Yes, him. Love him.