There was a time when a file of source code might not fit in memory, or would take up a significant fraction of it. But that hasn't been the case on any developer machine in 20+ years. And the overhead of FILE* accessors like fgetc is substantial. Strings in memory are always going to be faster.
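Roughly what I mean by just slurping the file, as a minimal C sketch (read_whole_file is a made-up name, not from any particular lexer):

    #include <stdio.h>
    #include <stdlib.h>

    /* Read the entire file into one heap buffer; the lexer then scans it
       with plain pointer arithmetic instead of a function call per char. */
    static char *read_whole_file(const char *path, size_t *len_out) {
        FILE *f = fopen(path, "rb");
        if (!f) return NULL;
        if (fseek(f, 0, SEEK_END) != 0) { fclose(f); return NULL; }
        long n = ftell(f);
        if (n < 0) { fclose(f); return NULL; }
        rewind(f);
        char *buf = malloc((size_t)n + 1);
        if (buf && fread(buf, 1, (size_t)n, f) == (size_t)n) {
            buf[n] = '\0';              /* NUL-terminate so scanning is simple */
            *len_out = (size_t)n;
        } else {
            free(buf);
            buf = NULL;
        }
        fclose(f);
        return buf;
    }

Compare that to calling fgetc once per character, which goes through the stdio buffering machinery (and possibly locking) on every byte.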
Well, the overhead of the stream API is in the noise. If the lexer/parser don't support incremental parsing, it doesn't really matter. But incremental parsing can be important in some situations. For instance, if you're parsing a 1GB JSON blob, keeping the whole thing in memory at once can easily be an issue. Plus, if you stall waiting for the entire input to arrive before parsing, you add latency, if that matters.
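A rough sketch of the incremental side, assuming a push-style parser API (parser_feed and parser_finish are hypothetical stand-ins, not any real library's signatures):

    #include <stdio.h>

    /* Hypothetical push-parser interface; substitute whatever incremental
       API your parser actually exposes. */
    int parser_feed(void *parser, const char *buf, size_t len);
    int parser_finish(void *parser);

    /* Feed fixed-size chunks as they are read, so the full 1GB blob is
       never resident and parsing overlaps with I/O. */
    int parse_stream(FILE *in, void *parser) {
        char chunk[64 * 1024];
        size_t n;
        while ((n = fread(chunk, 1, sizeof chunk, in)) > 0) {
            if (parser_feed(parser, chunk, n) != 0)
                return -1;          /* bail early on a parse error */
        }
        if (ferror(in))
            return -1;
        return parser_finish(parser);
    }

Memory stays bounded by the chunk size (plus whatever state the parser keeps), and you can start reporting results or errors before the last byte has even arrived.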