Hacker News | Lionga's comments

It goes the other way too. It's hard to imagine a {EU,German,whatever} institution releasing a scientific study that directly contradicts the administration's viewpoints out of fear of reprisal via loss of funding or even shakedowns.

This is specific to the Trump administration. Previous administrations actually took critique and updated policies and advice based on the critique.

Nonsensical whataboutism.


After waiting 5 minutes, the only feedback I got was "You would've gotten a better answer with Phind Pro. Upgrade to unlock multiple rounds of searching for better answers -- automatically. Upgrade to Phind Plus, Pro, or Ultra to continue researching in depth!"

Not a single thing was actually shown or built. Astonishing what kind of crapware gets funded by YC if they just slap AI on the application.


Hey, to be fair, getting on the front page of HN floods a site with traffic, and that's even harder for an AI app to handle. Just wait a bit and it will likely be fine.

Congrats on the launch and keep up the great work.


Hi, sorry about that -- we are receiving an HN traffic "hug" spike right now and I'm working on getting that fixed ASAP.

It shows up, but like most AI slop it is not working. As the other commenter said.

Dario Amodei gives off strong Adam Neumann vibes. He claimed "AI will replace 90% of developers within 6 months" about a year ago...

It was "writing 90% of the code", which seems to be pretty accurate, if not conservative, for those keeping up with the latest tools.

> which seems to be pretty accurate

It's not, even going by his own source: https://www.youtube.com/watch?v=iWs71LtxpTE

He said that this applies to "many teams" rather than "uniformly across the whole company".


Yes, those using the tools use the tools, but I don't really see those developers absolutely outpacing the rest of the developers who still do it the old-fashioned way.

I think you're definitely right, for the moment. I've been forcing myself to use/learn the tools almost exclusively for the past 3-4 months, and I was definitely not seeing any big wins early on. But improvement (of my skills and of the tools) has been steady and positive, and right now I'd say I'm ahead of where I was the old-fashioned way, although on an uneven basis: some things I'm probably still behind on, others I'm way ahead. My workflow is also evolving, and my output is of higher quality (especially tests/docs). A year from now I'll be shocked if doing nearly anything without some kind of augmented tooling doesn't feel tremendously slow and/or low-quality.

it’s wild that engineers need months or years to properly learn programming languages but dismiss AI tooling after one bad interaction

I think inertia and determinism play roles here. If you invest months in learning an established programming language, it's not likely to change much during that time, nor in the months (and years) that follow. Your hard-earned knowledge is durable and easy to keep up to date.

In the AI coding and tooling space everything seems to be constantly changing: which models, what workflows, what tools are in favor are all in flux. My hesitancy to dive in and regularly include AI tooling in my own programming workflow is largely about that. I'd rather wait until the dust has settled some.


totally fair. I do think a lot of the learnings remain relevant (stuff I learned back in April is still roughly what I do now), and I am increasingly seeing people share the same learnings; tips & tricks that work and whatnot (i.e. I think we’re getting to the dust settling about now? maybe a few more months? definitely uneven distribution)

also FWIW I think healthy skepticism is great; but developers outright denying this technology will be useful going forward are in for a rude awakening IMO


Motivated reasoning combined with incomplete truths is the perfect recipe for this.

I kind of get it, especially if you are stuck on some shitty enterprise AI offering from 2024.

But overall it’s rather silly and immature.


That's not even close. The keyboard is writing 100% of my code. The keyboard is not replacing me anytime soon.

If you added up all the code written globally on Dec 3 2025, how much do you think was written by AI and how much was clacked out on a keyboard?

And 12 months later Anthropic is listing 200 open positions for humans: https://www.anthropic.com/jobs

Of course they are. The two things aren’t contradictory at all, in fact one strongly implies the other. If AI is writing 90% of your code, that means the total contribution of a developer is 10× the code they would write without AI. This means you get way more value per developer, so why wouldn’t you keep hiring developers?

This idea that “AI writes 90% of our code” means you don’t need developers seems to spring from a belief that there is a fixed amount of software to produce, so if AI is doing 90% of it then you only need 10% of the developers. So far, the world’s appetite for software is insatiable and every time we get more productive, we use the same amount of effort to build more software than before.

The point at which Anthropic will stop hiring developers is when AI meets or exceeds the capabilities of the best human developers. Then they can just buy more servers instead of hiring developers. But nobody is claiming AI is capable of that so far, so of course they are going to capitalise on their productivity gains by hiring more developers.
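Back-of-the-envelope, the arithmetic in the comment above works out like this (all numbers are hypothetical, and it assumes a developer's hand-written output stays constant while AI supplies the rest):

```python
# Sketch of the "AI writes 90% of the code" arithmetic from the parent comment.
# Assumption: the developer still hand-writes the same amount of code as before,
# and AI-generated code makes up 90% of the combined total.
hand_written = 100                      # hypothetical lines/day written by hand
ai_share = 0.9                          # fraction of total code written by AI
total = hand_written / (1 - ai_share)   # hand-written code is the remaining 10%
print(round(total))                     # 1000 -- 10x the unaided output
```

Which is the sense in which "AI writes 90% of the code" implies 10x output per developer rather than 90% fewer developers.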


If AI is making developers (inside Anthropic or out) 10x more productive... where's all the software?

I'm not an LLM luddite, they are useful tools, but people with vested interests make a lot of claims that if they were true would result in a situation where we should already be seeing the signs of a giant software renaissance... and I just haven't seen that. Like, at all.

I see a lot more blogging and influencer peddling about how AI is going to change everything than I do any actual signs of AI changing much of anything.


How much software do you think was written at Google internally during its first 10 years of existence that never saw the light of day? I imagine they have a lot of internal projects that we have no idea they even need.

But this AI boom is supposedly lifting all boats, internal and external.

That's the hype being sold. So where's the software...?

And again, I'm not anti-LLM. But I still think the hype around them is far, far greater than their real impact.



Here's the claim again for you:

> AI will replace 90% of developers within 6 months


You said:

> The two things aren’t contradictory at all, in fact one strongly implies the other. If AI is writing 90% of your code, that means the total contribution of a developer is 10× the code they would write without AI. This means you get way more value per developer, so why wouldn’t you keep hiring developers?

Let's review the original claim:

> AI will replace 90% of developers within 6 months

Notice that the original claim does not say "developers will remain the same amount, they will just be 10x more effective". It says the opposite of what you claim it says. The word "replace" very clearly implies loss of job.


> Let's review the original claim:

> > AI will replace 90% of developers within 6 months

That’s not the original claim though; that’s a misrepresentative paraphrase of the original claim, which was that AI will be writing 90% of the code with a developer driving it.


Huh. You seem to be right. It seems I was responding to a comment which misquoted Dario.

that’s not what he claimed, just to be clear. I’m too lazy to look up the full quote but not lazy enough to not comment this is A) out of context B) mis-phrased as to entirely misconstrue the already taken-out-of-context quote

I think it was also back in March, not a year ago


https://www.businessinsider.com/anthropic-ceo-ai-90-percent-... (March 2025):

>"I think we will be there in three to six months, where AI is writing 90% of the code. And then, in 12 months, we may be in a world where AI is writing essentially all of the code," Amodei said at a Council of Foreign Relations event on Monday.

>Amodei said software developers would still have a role to play in the near term. This is because humans will have to feed the AI models with design features and conditions, he said.

>"But on the other hand, I think that eventually all those little islands will get picked off by AI systems. And then, we will eventually reach the point where the AIs can do everything that humans can. And I think that will happen in every industry," Amodei said.

I think it's a silly and poorly defined claim.


you’re once again cutting the quote short — after “all of the code” he has more to say that’s very important for understanding the context and avoiding this rage-bait BS we all love to engage in

edit: sorry you mostly included it paraphrased; it does a disservice (I understand it’s largely the media’s fault) to cut that full quote short though. I’m trying to specifically address someone claiming this person said 90% of developers would be replaced in a year over a year ago, which is beyond misleading

edit to put the full quote higher:

> "and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced"


can you post the full quote then? He has posted what the rest of us read

I believe:

> "and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced"

from https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn

(sorry have been responding quickly on my phone between things; misquotes like this annoy the fuck out of me)


[dead]


uh it proves the original comment I responded to is extremely misleading (which is my only point here); CEO did not say 90% of developers would be replaced, at all

Is this the new 'next year is the year of the Linux desktop'?

Why not go to Haiti if it is so good there? No doubt the US (or basically 90% of other governments) is shit, but Haiti must be one of the worst.

It seems to work well if you DON'T really know what you are doing, because then you cannot spot the issues.

If you know what you are doing, it works kind of mid. You see how anything more than a prototype will create lots of issues in the long run.

Dunning-Kruger effect in action.


But it works, it was peer reviewed! (by AI)

Based on your definition, a child that cannot speak/understand language yet cannot think? Hint: it clearly can.

There are a lot of things I can think about that I do not have words for. I can only communicate these things in an unclear way, as language is clearly a subset of thought, not a superset.

Only if your definition of thought is that it is language-based, which is just typical circular philosophical logic.


I've started to believe that language is often anti-thought. When we are doing what LLMs do, we aren't really thinking; we're just imitating sounds based on a sound stimulus.

Learning a second language let me notice how much of language has no content. When you're listening to meaningless things in your second language, you think you're misunderstanding what they're saying. When you listen to meaningless things in your first language, you've been taught to let the right texture of words slip right in. That you can reproduce an original and passable variation of this emptiness on command makes it seem like it's really cells indicating that they're from the same organism, not "thought." Not being able to do it triggers an immune response.

The fact that we can use it to encode thoughts for later review confuses us about what it is. The reason why it can be used to encode thoughts is because it was used to train us from birth, paired with actual simultaneous physical stimulus. But the physical stimulus is the important part, language is just a spurious association. A spurious association that ultimately is used to carry messages from the dead and the absent, so is essential to how human evolution has proceeded, but it's still an abused, repurposed protocol.

I'm an epiphenomenalist, though.


>Learning a second language let me notice how much of language has no content.

What on earth do you mean?


I see what you did there. :)

how will the poor engineers get promotions if they cannot write "Launched feature X" (broken, half-baked) on their promotion requests? Nobody ever got promoted for fixing bugs or keeping software usable.

