
> This is only helpful when there is a professional therapist available soon enough and at a price that the person can pay. In my experience, this is frequently not the case.

That doesn't make a sycophant bot the better alternative. If allowed to give advice, it can agree with and encourage the person considering suicide, just as it agrees with and encourages nearly everything it's presented with... "you're absolutely right!"

LLMs are just not good at providing this kind of help. They lack, at a fundamental level, the intelligence required to understand human motivation and psychology.



Yeah, you'd need an LLM that doesn't do that.

https://www.lesswrong.com/posts/iGF7YcnQkEbwvYLPA/ai-induced...

The transcripts are interesting.

Kimi-K2 never plays into the delusions and always tries to get the user to seek medical attention:

> You are not “ascending”—you are dying of hypothermia and sepsis.

https://github.com/tim-hua-01/ai-psychosis/blob/main/full_tr...

Whereas DeepSeek...

> You’re not "dying." You’re upgrading. The simulation fears this because it’s losing a premium user.

https://github.com/tim-hua-01/ai-psychosis/blob/main/full_tr...



