You don’t just “use a GPU.” Software doesn’t magically get better by throwing it at a GPU.
Moreover, you can’t just run any old software on a GPU; it has to be built for it.
Even then, the GPU is just a speed increase, not a magical make-it-better box.
That’s like saying any random text editor would be a better translator if it ran on a GPU.
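To make “built for it” concrete, here’s a rough Python/JAX sketch (assuming JAX is installed; `translate_lookup` and `score_candidates` are made-up illustrative functions, not anything Google Translate actually runs): ordinary control flow and string handling stay on the CPU no matter what hardware you have, and only the array math explicitly written against a GPU-aware framework can be accelerated.

```python
# Minimal sketch, not a real translator: shows which kind of code a GPU can
# even touch. Assumes JAX is installed; function names are hypothetical.
import jax
import jax.numpy as jnp

def translate_lookup(word, table):
    # Plain Python dict lookup: runs on the CPU regardless of what GPUs exist.
    return table.get(word, word)

@jax.jit
def score_candidates(embeddings, query):
    # Dense matrix math like this is what a GPU actually accelerates, and only
    # because it was written against a GPU-capable framework in the first place.
    return embeddings @ query

print(translate_lookup("hola", {"hola": "hello"}))
emb = jnp.ones((1024, 512))
q = jnp.ones((512,))
print(score_candidates(emb, q).shape)  # runs on a GPU only if one is available
```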
EDIT: Google also literally builds its own acceleration hardware, so suggesting that it can’t afford GPUs (which it already owns for GCP) for Google Translate is weird.
You have it backwards. For any given amount of compute, GPUs are a lot cheaper than CPUs. But GPUs can’t run all the code a CPU can.
The reason models that run on GPUs cost more to run is that they tend to be A LOT more compute-intensive. If you ran inference for those same models on CPUs, it would be both much more expensive than on GPUs (or on specialized tensor silicon) and slower.
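A quick, hedged sketch of that point (assuming JAX is installed with GPU support; the matrix size and timing loop are arbitrary): the exact same jitted computation, placed on the CPU vs the GPU, shows why per unit of this kind of compute the GPU is the cheaper, faster option.

```python
# Sketch only: times the same compiled matmul on CPU and (if present) GPU.
# Assumes JAX with a GPU backend; sizes/iteration counts are arbitrary.
import time
import jax
import jax.numpy as jnp

def bench(device, n=4096):
    # Put identical operands on the chosen device; the code itself is unchanged.
    key = jax.random.PRNGKey(0)
    a = jax.device_put(jax.random.normal(key, (n, n)), device)
    b = jax.device_put(jax.random.normal(key, (n, n)), device)
    f = jax.jit(jnp.matmul)
    f(a, b).block_until_ready()  # warm-up / compile
    start = time.perf_counter()
    for _ in range(5):
        f(a, b).block_until_ready()
    return (time.perf_counter() - start) / 5

print("CPU:", bench(jax.devices("cpu")[0]), "s per matmul")
try:
    print("GPU:", bench(jax.devices("gpu")[0]), "s per matmul")
except RuntimeError:
    print("no GPU backend available")
```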