
Why would it get worse?


Unleashing a hallucinating LLM to make edits will create so many subtle problems, at such a scale, that it may not be possible to clean them up once other edits are made on top.


They clearly stated that this is not the intention. The closest thing to it would be translation help, and even then they'll undoubtedly include a notice like "this has been translated with the help of AI," along with a prompt encouraging readers to help improve the translation.



