> That would be truthfulness to the training material, I guess. If you train on Reddit posts, it’s questionable how true the output really is.

Maybe it learns to recognize when something is true, even if it isn't fed true statements all the time?