
The pop culture perception of AI as just image and text generators is incorrect. AI is many things, and they all need tons of RAM. Google, for instance, is rolling out self-driving taxis in more and more cities.


Congrats on engaging with the facetious part of my comment, but I think the question still stands: do you think the current level of AI-driven data center demand will continue indefinitely?

I feel like the question of how many computers are needed to steer a bunch of self-driving taxis probably has an answer, and I bet it's not anything even remotely close to what would justify a decade's worth of maximum investment in silicon for AI data centers, which is what we were talking about.
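As a rough sketch of that claim (every number here is an illustrative assumption, not a real Waymo or Google figure):

    # Back-of-envelope: how much compute would a robotaxi fleet need,
    # expressed in data-center-GPU equivalents? All numbers assumed.
    VEHICLES = 100_000           # hypothetical fleet size
    TFLOPS_PER_VEHICLE = 500     # assumed onboard inference budget per car
    GPU_TFLOPS = 1_000           # assumed throughput of one DC accelerator

    fleet_tflops = VEHICLES * TFLOPS_PER_VEHICLE
    gpu_equivalents = fleet_tflops / GPU_TFLOPS
    print(f"~{gpu_equivalents:,.0f} GPU-equivalents")  # -> ~50,000
    # Tens of thousands of GPU-equivalents is a lot, but it's orders of
    # magnitude short of a decade of maximum data-center build-out.

Under assumptions like those, the answer is bounded and not even close to the investment in question.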


Data center AI is also completely uninteresting/non-useful for self-driving taxis, or any other self-driving vehicle.


Do you know, comparatively, how much GPU time training the models that run Waymo costs versus training Gemini? I'm genuinely curious; my assumption would be that Google has devoted at least as much GPU time in its data centers to training Waymo models as it has to Gemini models. But if Waymo is significantly more efficient on training (or inference?), that's very interesting.
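One way to frame it: training compute is commonly approximated as 6 x parameters x training tokens. The model sizes below are purely illustrative assumptions, not disclosed Waymo or Gemini figures:

    # Standard scaling-law approximation: train FLOPs ~ 6 * N * D,
    # where N = parameter count and D = training tokens.
    def train_flops(params, tokens):
        return 6 * params * tokens

    # Hypothetical sizes, for illustration only:
    llm = train_flops(1e12, 10e12)     # assumed frontier LLM
    driving = train_flops(1e9, 1e12)   # assumed perception/planning model
    print(f"LLM / driving ratio: ~{llm / driving:,.0f}x")  # -> ~10,000x

If the driving models really are that much smaller, they could cost orders of magnitude less GPU time to train, but the true ratio depends entirely on numbers Google hasn't published.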


My note is specifically about operating them. For training the models, it certainly can help.


A decade is far from indefinite.


AI is needed to restart feudalism?



