
For me, where electricity is $0.45/kWh, assuming 1 kW consumption, it would be around $80 per million tokens!


I think you might have to show your math on that one.


They said 1.5 tokens/second, so 1 million tokens takes about 667k seconds, or roughly 185 hours. 1 kW * 185 hr * $0.45/kWh ≈ $83 per million tokens. Again, assuming 1 kW, which may be high (or low). The cost of the physical computation is the electricity cost.
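
As a quick back-of-envelope sketch of the same arithmetic (the 1.5 tokens/s, 1 kW, and $0.45/kWh figures are just the assumptions from above):

    # Rough electricity cost (USD) to generate one million tokens locally.
    def cost_per_million_tokens(tokens_per_sec, watts, usd_per_kwh):
        hours = 1_000_000 / tokens_per_sec / 3600  # time to generate 1M tokens
        kwh = watts / 1000 * hours                 # energy consumed in that time
        return kwh * usd_per_kwh

    print(cost_per_million_tokens(1.5, 1000, 0.45))  # ~83 USD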


They said it has a crappy GPU; the whole computer probably only uses 200-250 watts.


No way. 768GB of RAM will have significant power draw. DDR4 (which this probably is) is something like 3W per 8GB. That's nearly 290W alone.

So, say 500W. That's, for me in my city with expensive electricity, about $40/million tokens, with the pretty severe rate limit of 5400 tokens/hour.

If you're in Texas, that would be closer to $10/million tokens! Now you're at the same price as GPT-4o.
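
Roughly the same estimate in code (the 3W per 8GB DDR4 figure is from above; the ~$0.12/kWh rate is just my guess at cheap Texas power):

    # RAM power estimate plus cost per million tokens at two electricity rates.
    ram_watts = 768 / 8 * 3                      # ~3 W per 8 GB of DDR4 -> ~288 W
    total_watts = 500                            # rough guess for the whole box

    hours_per_million = 1_000_000 / 1.5 / 3600   # ~185 hr at 1.5 tokens/sec
    for usd_per_kwh in (0.45, 0.12):             # expensive city vs. cheap power
        cost = total_watts / 1000 * hours_per_million * usd_per_kwh
        print(f"${usd_per_kwh}/kWh -> ${cost:.0f} per million tokens")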


But you can run and experiment with any model of your liking. And your data does not leave your desktop environment. You can build services. I don't think anybody doing this is doing it to save $20 a month.


Yes. I was only making a monetary comparison.

Relatedly, you can get a whole lot of cloud compute for $2k for those same experiments, on much faster hardware.

But yes, the data stays local. And, it's fun.

This comment chain is pretty funny.



