
GPT-4 on the web is served at a loss; $20/mo doesn't cover it unless someone is a light user. GPT-4 through the API is not at a loss, and I doubt GPT-3.5 is at a loss either.
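
A rough back-of-envelope sketch of that claim, purely illustrative: the per-token prices and the usage figures below are assumptions, not OpenAI's actual numbers, and the point is only that a heavy subscriber can plausibly exceed $20/mo at API-style pricing while lighter users stay well under it.

    # Illustrative break-even sketch: all prices and usage numbers are assumptions.
    INPUT_PRICE_PER_1K = 0.03    # assumed $/1K input tokens for a GPT-4-class API
    OUTPUT_PRICE_PER_1K = 0.06   # assumed $/1K output tokens
    SUBSCRIPTION = 20.00         # monthly web subscription price

    def monthly_api_cost(input_tokens: int, output_tokens: int) -> float:
        """Cost of serving this usage if it were billed at the assumed API rates."""
        return (input_tokens / 1000) * INPUT_PRICE_PER_1K + \
               (output_tokens / 1000) * OUTPUT_PRICE_PER_1K

    # A hypothetical heavy user: 200 chats/month, ~1.5K tokens in, ~1K tokens out each.
    cost = monthly_api_cost(input_tokens=200 * 1500, output_tokens=200 * 1000)
    print(f"Assumed heavy user: ~${cost:.2f}/mo vs ${SUBSCRIPTION:.2f} subscription")
    # -> ~$21/mo, i.e. roughly break-even; lighter usage is well under $20.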


Given that $20 buys at least a couple of hours' worth of GPU compute, it's hard to believe the average user spends that much time (specifically, waiting for the response) on GPT-4 on the web every month. Not to mention that they probably have optimizations that batch concurrent queries together and make things more efficient at scale. A sketch of that arithmetic follows below.
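
For a sense of scale (every figure here is an assumption, not a measurement): at a hypothetical GPU rental rate, $20 buys a few hours of dedicated GPU time, and batching spreads each GPU-hour across many concurrent requests, so the time a single user actually spends waiting on generation is covered many times over.

    # Rough sketch of per-user GPU cost with batching; all numbers are assumptions.
    GPU_HOURLY_RATE = 8.00      # assumed $/hour for a GPU able to serve the model
    MONTHLY_BUDGET = 20.00      # the subscription price

    dedicated_hours = MONTHLY_BUDGET / GPU_HOURLY_RATE
    print(f"${MONTHLY_BUDGET:.0f} buys ~{dedicated_hours:.1f} dedicated GPU-hours/month")

    # With batching, one GPU serves several requests at once, so each user only
    # "owns" a fraction of the GPU while their response is being generated.
    BATCH_SIZE = 8              # assumed average number of requests batched together
    effective_hours = dedicated_hours * BATCH_SIZE
    print(f"With ~{BATCH_SIZE}x batching, that covers ~{effective_hours:.0f} hours "
          f"of response-generation time per user per month")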


These assertions are based off of what?


The assertions were based on a prompt sent to ChatGPT: "are you more efficient than google translate, cost-wise?"



