Hacker News

Ollama isn't an inference engine; it's a GUI slapped onto a perpetually out-of-date vendored copy of llama.cpp underneath.

So, if you're trying to accurately count llama.cpp downloads, you'd combine those two. Also, I imagine most users on macOS aren't using Homebrew; they're getting it directly from the GitHub releases, so you'd also have to count those.
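As a rough sketch of how you might count the GitHub releases downloads: the GitHub REST API's `/repos/{owner}/{repo}/releases` endpoint returns each release with an `assets` list, and each asset carries a `download_count`. The function name and the sample payload below are mine, invented for illustration; the numbers are made up, not real llama.cpp stats.

```python
def total_release_downloads(releases):
    """Sum download_count across all assets of all releases,
    as returned by GET /repos/{owner}/{repo}/releases."""
    return sum(
        asset.get("download_count", 0)
        for release in releases
        for asset in release.get("assets", [])
    )

# Trimmed sample of the API's response shape (real responses
# have many more fields; these counts are fabricated):
sample = [
    {"tag_name": "b1234", "assets": [
        {"name": "llama-b1234-macos-arm64.zip", "download_count": 120},
        {"name": "llama-b1234-win-x64.zip", "download_count": 340},
    ]},
    {"tag_name": "b1233", "assets": [
        {"name": "llama-b1233-macos-arm64.zip", "download_count": 75},
    ]},
]

print(total_release_downloads(sample))  # 535
```

Note that the endpoint is paginated, so a real count would have to walk every page, and it still misses package managers, mirrors, and source builds.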



Actually, Ollama has stopped using llama.cpp and now uses ggml directly.



