That's fair, though with an LLM (at least one you're familiar with) you can shape its behavior. That's not too different from some black-box script that I can't control or reason through with human support. Granted, the LLM will be running the same stupid black-box script, so in both cases it's weaponized stupidity against the consumer.
