Hacker News

The die size of the B580 is 272 mm², which is a lot of silicon for $249. The performance of the GPU is good for its price but bad for its die size. Manufacturing cost is closely tied to die size.
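To see why die size drives cost, here's a minimal sketch using the standard dies-per-wafer approximation (wafer area divided by die area, with a correction term for partial dies lost at the circular edge). The 300 mm wafer size is the industry norm; the function name and the 200 mm² comparison die are my own illustration, not figures from this thread.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Approximate gross dies per wafer (before yield losses):
    wafer area / die area, minus an edge-loss correction term."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(wafer_area / die_area_mm2 - edge_loss)

# A B580-class 272 mm² die vs a hypothetical smaller 200 mm² die:
print(dies_per_wafer(272))  # ≈ 219 gross dies per 300 mm wafer
print(dies_per_wafer(200))  # ≈ 306 gross dies per 300 mm wafer
```

Since wafer cost is roughly fixed for a given process node, fewer dies per wafer means a higher cost per die, and larger dies also yield worse, which compounds the effect.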

272 mm² puts the B580 in the same league as the Radeon 7700 XT, a $449 card, and the GeForce 4070 Super, a $599 card. The idea that Intel is selling these cards at a loss sounds reasonable to me.



Though you're assuming the competition's prices are reasonable. There are plenty of reasons they might not be: availability issues, lack of competition, other more lucrative avenues, etc.

Intel has none of those factors, or at least not to the same degree.


At a loss seems a bit overly dramatic. I'd guess Nvidia sells SKUs for three times their marginal cost. Intel is probably operating at cost without any hopes of recouping R&D with the current SKUs, but that's reasonable for an aspiring competitor.


It kinda seems the price is also covering the cost of the massive resources they're throwing at getting Arc's drivers into shape.


I really hope they stick with it and become a viable competitor in every market segment a few more years down the line.


The drivers are shared with their iGPUs, so the cost of improving them is likely amortized across those products too.


The idea that Intel is selling these at a loss does not sound reasonable to me:

https://news.ycombinator.com/item?id=42505496

The only way this would be at a loss is if they refuse to raise production to meet demand. That said, I believe their margins on these are unusually low for the industry. They might even fall into razor-thin territory.



