It literally can’t reason in any shape or form. It’s absolutely not AGI, not even close [1]

[1] We can’t really know how close or far AGI is; that’s an unknown unknown. But arguably we have hit a limit with LLMs, and they are not the road to AGI — even though they have countless useful applications.


