
Why use ChatGPT instead of using the GPT-3 API directly?

For the type of queries you are doing (sending the whole context each time), the output is comparable (and just as wrong) between ChatGPT and GPT-3.



AFAIK there's no way with GPT-3 to build on your previous prompt and have it change the output in specific ways?


You feed the previous output back in as the final tokens of the next prompt.
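
A minimal sketch of that feed-it-back-in approach, assuming the pre-1.0 `openai` Python client; the model name, the `ask` helper, and the "User:/Assistant:" prompt format are illustrative, not part of any official API:

  import openai

  openai.api_key = "YOUR_API_KEY"  # placeholder

  history = ""  # accumulated conversation so far

  def ask(user_message: str) -> str:
      global history
      # Prepend everything said so far, so the model sees the prior
      # exchange as the leading tokens of this prompt.
      prompt = history + f"User: {user_message}\nAssistant:"
      response = openai.Completion.create(
          model="text-davinci-003",  # assumed GPT-3 model name
          prompt=prompt,
          max_tokens=256,
          temperature=0.7,
          stop=["User:"],  # stop before it invents the next user turn
      )
      answer = response["choices"][0]["text"].strip()
      # Feed the model's reply back in as part of the next prompt.
      history = prompt + " " + answer + "\n"
      return answer

  print(ask("Summarize the plot of Hamlet in one sentence."))
  print(ask("Now shorten that to five words."))

So "building on" a prompt is just string concatenation on the client side; the model itself is stateless between calls.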


ChatGPT is currently free; GPT-3 is not.



