
This is completely false - GPUs are not consumables, they are factors of production.

Models are technologies. Without the GPUs the technology is not accessible.

You sound like someone who thinks they have a strong understanding of economics when they don't.


I had to look up "factors of production" to see what this was about.

Looks to me like, as with a drill bit, a GPU could be reasonably classified as either a consumable or a factor of production.

This is because GPUs wear out and fail; the smaller the features, the faster electromigration kills them.
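
As a rough sketch of why feature size matters here, below is the standard electromigration lifetime model (Black's equation) in Python. The parameter values and the "half-width wires, roughly double the current density" assumption are illustrative placeholders, not measured GPU data.

    import math

    # Black's equation for electromigration mean time to failure (MTTF):
    #   MTTF = A * J**(-n) * exp(Ea / (k * T))
    # J = current density, Ea = activation energy, T = temperature in kelvin.
    # All parameter values below are illustrative, not vendor data.

    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def mttf_relative(current_density, temp_k, a=1.0, n=2.0, ea_ev=0.9):
        """Relative MTTF from Black's equation (arbitrary units)."""
        return a * current_density ** (-n) * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_k))

    # Shrinking features while keeping currents similar raises current density;
    # assume the shrunk wires carry roughly twice the density at the same ~85 C die temp.
    baseline = mttf_relative(current_density=1.0, temp_k=358.0)
    shrunk = mttf_relative(current_density=2.0, temp_k=358.0)

    print(f"Relative lifetime after the shrink: {shrunk / baseline:.2f}x")
    # With the common exponent n = 2, doubling current density cuts expected
    # lifetime to about a quarter.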


Lol are people like you going to be enough to support the large revenues? Nope.

A firm that sees rising operating expenses without enough of an increase in revenue will start to cut back on LLM spending and become very frugal (e.g. rationing).


Before they cut back on human programmers?

Also a good point - railroads for sure brought a lot more optimism.

LLMs+Data centres on the other hand...


Lol, a bit dramatic at the end. There will be a correction in stocks that had AI-related growth priced in.

But what I see is the two big costs for America:

1) Less money being invested into risky AI projects in general, in both public (via cash flows from operations) and private markets.

2) The large tech firms that participated in heavy AI-related capex won't be trusted with their cash balances - i.e. they'll have to return more cash, leaving less money for reinvestment.

All the hype and fanfare that draws in investment comes with a cost - you gotta deliver. People have an asymmetric relationship between gains and losses.


Also, railways always had alternative uses at the time - e.g. logistics in warfare.

What other uses do GPUs have that are critical...? lol

In addition to your points, this is why I always laugh when people make backward-looking comparisons. What characteristics do they actually have in common? Very little.


GPUs do have a use in warfare though. I mean, LLMs are basically offensive weapons disguised as software engineers.

Sure, LLMs can kind of put together a prototype of some CRUD app, so long as it doesn’t need to be maintainable, understandable, innovative or secure. But they excel at persisting until some arbitrary, well-defined condition is met, and it appears to be the case that “you gain entry to system X” works well as one of those conditions.

Given the amount of industrial infrastructure connected to the internet, and the ways in which it can break, LLMs are at some point going to be used as weapons. And it seems likely that they’ll be rather effective.

FWIW, people first saw TNT as a way to dye things yellow, and then as a mining tool. So LLMs starting out as chatbots and then being seen as (bad) software engineers does put them in good company.


Imagine comparing something with a useful life of 100+ years to a thing that wears out, is much less durable, needs replacing far more often, and can become obsolete from innovation within its own product category.

Comical. China can keep innovating on GPUs, and all this existing spend to stock up on compute becomes a waste. Again, comical. Moreover, China has energy capacity that the US does not. Meaning all those GPUs that deliver less performance per watt? Yep, going in the bin.

So yeah.. carry on telling me how this is going to yield some supreme advantage lmao.


> GPUs do have a use in warfare though.

Unclassified public cloud GPUs are completely useless when your warfighting workloads are at the SECRET level or above.


They’re unclassified public cloud GPUs today, much the same as the massive industrial base of the United States was churning out harmless consumer widgets in 1939. Those widget makers happened to be reconfigurable into weapon makers, and so wartime production exploded from 2% to 40% of GDP in 5 years [1]. But the total industrial output of course didn’t expand by nearly that much.

I think it’s maybe plausible that private compute feels similar in the next do-or-die global war.

[1] https://eh.net/encyclopedia/the-american-economy-during-worl...


The United States has almost no domestic capability to produce advanced semiconductors. There is no abundance of industrial capacity cranking out GPUs that can be quickly diverted from AI companies into weapon systems.

Even if private compute were mature enough to use for classified workloads, with the infrastructure being managed by someone in India or China, securely getting data into and out of it remains a mostly unsolvable problem.


My point is that the existing private DCs can be reconfigured for a different use. Building new GPUs is not required to on-shore compute. We already have it. Obviously, if the military started contracting compute out onto the hyperscaler clusters it would involve a host of changes. I wasn’t aware that they were letting India and China manage their infrastructure… That seems exceedingly unlikely? That relationship would obviously be severed if the compute were reconfigured for the military.

The US is one of the very few countries with the ability to produce advanced semiconductors.

The US is probably second only to Taiwan in capacity to build advanced semiconductors, and the gap is now closing as Intel gets back on track.

wut? Intel with 18A can do it

Its low yields and tiny volumes are part of what gets the US from “no capacity” to “almost no capacity.”

Yields are constantly improving on a monthly basis - according to executives, around 7% per month - so the capability is definitely there, but yields still need some time.
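
Taking that ~7%-per-month figure at face value, a quick compounding sketch shows why "some time" is on the order of a year rather than a month or two. The 30% starting yield and 60% target are made-up illustrative numbers, not reported figures.

    # Compounding a claimed ~7%/month relative yield improvement.
    # Starting yield (30%) and target (60%) are illustrative assumptions.
    start_yield = 0.30
    target_yield = 0.60
    monthly_gain = 0.07

    months = 0
    y = start_yield
    while y < target_yield:
        y = min(y * (1 + monthly_gain), 1.0)
        months += 1

    print(f"~{months} months to go from {start_yield:.0%} to {target_yield:.0%}")
    # 1.07**11 is about 2.10, so roughly 11 months at a steady 7%/month.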

On the topic of warfare, wars are fought differently now. Compute will be mentioned in the same breath as total manufacturing output if a global war between superpowers erupts. In highly competitive industries this is already the case. Compute will be part of industrial mobilization in the same way that physical manufacturing or transportation capacity were mobilized in WWII. I’m not an expert on military computing but my intuition is that FLOPS are probably even more easily fungible into wartime compute than widget makers, and the US was able to go widgets->weapons on an unbelievable scale last time.

There are plenty of military uses for computing, but I also find it hard to believe anything but a handful of datacenters are or could be a major factor in anything but a completely one-sided war. They are very vulnerable targets that are easy to locate and require large amounts of power and cooling. I also just don't see the application: encryption capabilities far exceed the compute available for decryption, and the precision and speed of even 20-year-old tech far exceed the requirements of anything you would want to control. Even with tangible benefits, say 10% more or fewer casualties than there would be otherwise, in an exchange with anything resembling a peer military force I'm not sure it matters, because everybody already loses.

Is that in terms of data centres or chips on the battlefield? Surely the latter is most important. Or will war always have perfect connectivity?

You could argue that compute was a decisive factor in World War II even (used in code breaking and designing nuclear weapons).

> What other uses do GPUs have that are critical...? lol

GPUs are essential to every kind of scientific and engineering simulation you can think of. AI-accelerated simulations are a huge deal now.


GPUs that have lives of..?

Now compare that with the life of a railroad. Amusing.


Some of those railroad bridges might never have been constructed without those simulations.
