Respectfully, I disagree. An LLM, in my mind, is a new kind of compiler. It just takes natural language and produces code.


It feels like we're talking about different technologies sometimes.

I find it's a slightly improved Google for vague questions. Or a Doxygen writer.
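
To be concrete about the "Doxygen writer" part: the kind of header comment it's reliably fine at is boilerplate like this (hypothetical C function, just to illustrate):

    /**
     * @brief Clamp a value to the inclusive range [lo, hi].
     *
     * @param v  Value to clamp.
     * @param lo Lower bound (inclusive).
     * @param hi Upper bound (inclusive).
     * @return v limited to [lo, hi].
     */
    int clamp(int v, int lo, int hi) {
        return v < lo ? lo : (v > hi ? hi : v);
    }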

That's all the use I've found for any AI model since I first started playing with the GitHub Copilot beta.

I've been trying the newer models as they arrive, and found they're getting more verbose, more prone to hallucinating functions that don't exist, and more prone to praising me as a god when I ask about basic assumptions ("you're cutting to the heart of the matter").

What kind of code do you write where it's somehow replacing coding itself? I spent 30 minutes yesterday trying to get Mistral to write a basic bash script.


I'm playing with open-weights models at home, and yeah, they are like that. I use Claude 3.7 at work, and it's a lot better. Sometimes it flubs things, but it can also write large amounts of code, mostly how I want it (though the Pareto principle comes into play for the parts I don't want).

So for me, the future will tend toward this. The tech is in its early days: we have no way to steer its thinking, and no way to align it with our own thought processes. But eventually we'll get to "I want X, please make it," and it will be able to do it well.



