
They're better at it if you don't actually care about the truth of the result though.

I.e. confirming your opinion, compiling a report on something you're not going to act on, etc.



I'm bullish on LLMs. They're incredibly useful for quickly collating and presenting info on a topic you're proficient or competent in but maybe rusty on. Using programming as an example: this morning I asked Claude to write a batch file that launches 4 instances of my application, tiles their windows across my screen (on Windows), waits for input on the command line, and then kills them all again. It spat out a working script in about 15 seconds. It's not pretty, but I know enough batch and Win32 to know that it's going to work (and it does).
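
Roughly the shape of script I'm describing (a sketch, not the actual output; "myapp.exe" is a placeholder, and the tiling shells out to a small PowerShell helper for the Win32 MoveWindow call, since plain batch can't move windows on its own):

    @echo off
    REM Hypothetical sketch: launch four copies of a placeholder myapp.exe,
    REM tile them into screen quadrants, wait for a keypress, then kill them.
    setlocal
    set "APP=myapp.exe"
    set "PS1=%TEMP%\tile_windows.ps1"

    REM Start four instances without blocking the script.
    for /L %%i in (1,1,4) do start "" "%APP%"

    REM Give the windows a moment to appear before trying to move them.
    timeout /t 2 /nobreak >nul

    REM Plain batch can't reposition windows, so emit a PowerShell helper
    REM that P/Invokes user32!MoveWindow on each instance's main window.
    REM The process name below ("myapp", no .exe) is also a placeholder.
    > "%PS1%" echo Add-Type -AssemblyName System.Windows.Forms
    >>"%PS1%" echo Add-Type -Namespace Native -Name Win -MemberDefinition '[DllImport("user32.dll")] public static extern bool MoveWindow(IntPtr h, int x, int y, int w, int cy, bool repaint);'
    >>"%PS1%" echo $area = [System.Windows.Forms.Screen]::PrimaryScreen.WorkingArea
    >>"%PS1%" echo $halfW = [int]($area.Width / 2); $halfH = [int]($area.Height / 2)
    >>"%PS1%" echo $i = 0
    >>"%PS1%" echo foreach ($p in Get-Process myapp -ErrorAction SilentlyContinue) {
    >>"%PS1%" echo   if ($p.MainWindowHandle -eq 0 -or $i -ge 4) { continue }
    >>"%PS1%" echo   $x = ($i %% 2) * $halfW; $y = [int][math]::Floor($i / 2) * $halfH
    >>"%PS1%" echo   [Native.Win]::MoveWindow($p.MainWindowHandle, $x, $y, $halfW, $halfH, $true) ^| Out-Null
    >>"%PS1%" echo   $i++
    >>"%PS1%" echo }
    powershell -NoProfile -ExecutionPolicy Bypass -File "%PS1%"

    echo Press any key to close all instances...
    pause >nul
    taskkill /IM "%APP%" /F >nul 2>&1
    del "%PS1%" >nul 2>&1
    endlocal

The point is less the specifics than that this is exactly the kind of glue code that's tedious to write by hand but trivial to verify when you already know the APIs involved.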


Is that any different from Google results?



