
The problem is that the use case requires either not caring about the risk of hallucinations or being able to validate the output without already having the data in a useful format. On top of that, you need to lack the knowledge/skill to do it more quickly with awk/python/perl/whatever.
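To make the comparison concrete, here is a hypothetical example of the scripting alternative: a few lines of Python that perform a text transformation (pipe-delimited records to CSV) deterministically, with no model and no hallucination risk. The function name and input format are invented for illustration.

```python
import csv
import io

def pipe_to_csv(text: str) -> str:
    """Convert pipe-delimited lines to CSV. Deterministic: same input,
    same output, every time -- nothing to hallucinate."""
    out = io.StringIO()
    writer = csv.writer(out)
    for line in text.strip().splitlines():
        writer.writerow(field.strip() for field in line.split("|"))
    return out.getvalue()

print(pipe_to_csv("a | b | c\n1 | 2 | 3"))
# a,b,c
# 1,2,3
```

If you already know the stdlib `csv` module, this is faster to write and to verify than prompting a model and checking its output.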


I think text transformation is a sufficiently predictable task that one could make a transformer that completely avoids hallucinations. Most LLMs run at a high temperature, which introduces randomness, and with it hallucinations, into the result.
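The temperature mechanism is easy to show directly. A minimal sketch (the logit values are made up for illustration): dividing logits by the temperature before softmax flattens or sharpens the distribution, and at temperature 0 sampling degenerates into argmax, i.e. deterministic greedy decoding.

```python
import math
import random

def sample(logits, temperature, rng):
    """Sample a token index from logits at the given temperature.
    temperature == 0 is treated as greedy argmax (deterministic)."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale, then softmax (shifted by the max for numerical stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    # Inverse-CDF sampling over the resulting distribution.
    r = rng.random()
    acc = 0.0
    for i, e in enumerate(exps):
        acc += e / total
        if r < acc:
            return i
    return len(logits) - 1

logits = [2.0, 1.0, 0.5]
rng = random.Random(0)
print(sample(logits, 0, rng))  # always index 0: the highest logit
# At higher temperature, lower-logit tokens get sampled too:
print({sample(logits, 2.0, rng) for _ in range(50)})
```

This is only the decoding side; it shows why low or zero temperature makes the output reproducible, not that it makes the model's content correct.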


That's why having good test suites and tools is more important than ever.
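One way that looks in practice: treat the model's output as untrusted and gate it with cheap invariant checks. A hypothetical validator for the pipe-to-CSV case (function and format are invented for illustration) rejects output that drops, invents, or reshapes records:

```python
import csv
import io

def validate_csv_output(original_lines: list[str], csv_text: str) -> bool:
    """Check model output against cheap invariants of the source:
    same number of records, same number of fields per record."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    if len(rows) != len(original_lines):
        return False  # records dropped or invented
    return all(len(row) == original_lines[i].count("|") + 1
               for i, row in enumerate(rows))

source = ["a|b|c", "1|2|3"]
print(validate_csv_output(source, "a,b,c\n1,2,3\n"))  # True
print(validate_csv_output(source, "a,b,c\n1,2\n"))    # False: field lost
```

Checks like these don't prove the output is right, but they catch the failure modes that matter most for bulk transformations.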



