
A really interesting analogy in the video is the discussion of the two systems of thinking from the book Thinking, Fast and Slow by Daniel Kahneman.

System 1 thinking: fast, automatic thinking and rapid decisions. For example, when someone asks you 2 + 2, you don't think; you just reply instantly. LLMs currently only have System 1 thinking.

System 2 thinking: slow, rational thinking for complex decisions. For example, when someone asks you 17 x 24, you think slowly and deliberately to work out the multiplication. This kind of thinking is a major component we need for AGI. The current rumor from OpenAI about the so-called "Q*" algorithm could be something related to System 2 thinking (just speculation at this point).
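The 17 x 24 contrast can be sketched in code (a toy illustration of my own, not anything from the video): System 1 as instant recall of a memorized fact, System 2 as explicit, step-by-step long multiplication.

```python
# Toy sketch: one-shot recall vs. deliberate multi-step computation.

MEMORIZED = {(2, 2): 4}  # facts answerable instantly, System 1 style

def system1(a, b):
    """Fast path: answer only if the fact is already memorized."""
    return MEMORIZED.get((a, b))

def system2(a, b):
    """Slow path: decompose a * b into explicit partial products,
    the way a person works it out on paper."""
    steps = []
    total = 0
    for digit_pos, digit in enumerate(reversed(str(b))):
        partial = a * int(digit) * (10 ** digit_pos)
        steps.append(partial)
        total += partial
    return total, steps

print(system1(2, 2))    # instant: 4
print(system1(17, 24))  # no memorized fact: None
print(system2(17, 24))  # (408, [68, 340]) -- i.e. 17*4 + 17*20
```

The point of the sketch: system1 either answers immediately or not at all, while system2 always succeeds but only by grinding through intermediate steps.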



I was thinking about something similar.

System 1 thinking: patterns.

System 2 thinking: logic.

For example:

System 1 thinking: does sentence x sound like a correct English sentence?

System 2 thinking: verify that sentence x is a correct English sentence by applying grammar rules.

Someone fluent in English can form correct English sentences using only System 1 thinking, while someone who has just started learning English must consciously apply grammar rules (System 2 thinking) to do the same.
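The fluent-speaker vs. learner contrast can be sketched too (again a toy example of my own, with a made-up word-pair table and a deliberately oversimplified agreement rule): System 1 as pattern-matching against familiar word pairs, System 2 as checking an explicit rule.

```python
# Toy sketch: "sounds right" via learned word-pair patterns
# vs. checking an explicit grammar rule.

FLUENT_BIGRAMS = {("the", "cat"), ("cat", "sits"), ("cats", "sit")}

def system1_sounds_right(sentence):
    """System 1: a fluent speaker pattern-matches; a sentence 'sounds'
    right if every adjacent word pair is a familiar one."""
    words = sentence.lower().split()
    return all(pair in FLUENT_BIGRAMS for pair in zip(words, words[1:]))

def system2_check_agreement(subject, verb):
    """System 2: a learner applies an explicit rule -- here a toy
    subject-verb agreement check: a singular subject takes a verb
    ending in 's' (ignoring irregular verbs entirely)."""
    singular = not subject.endswith("s")
    return verb.endswith("s") == singular

print(system1_sounds_right("the cat sits"))    # True: every pair familiar
print(system1_sounds_right("the cat sit"))     # False: ("cat", "sit") unfamiliar
print(system2_check_agreement("cat", "sits"))  # True
print(system2_check_agreement("cat", "sit"))   # False
```

Note the asymmetry: the System 1 check only works for patterns it has already absorbed, while the System 2 check generalizes to subjects it has never seen but is slower and needs the rule stated explicitly.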




