That's not the marketing message at the moment. I see ads for AI (LLM) powered services and they all say the same thing: "Stressed? Not enough time? Let AI do it faster so you can do more." AI is sold as a tool that does things faster than a human can, and since LLMs do not cite their sources, there is no telling where the data came from and no way to verify it.
Validating that a complex spreadsheet is correct is notoriously difficult at the best of times; unfortunately, spreadsheets are about the closest thing to a write-only language in common use, and you have to front-load far more care than you do in conventional modern languages. The usual safeguards of testing and code review are essentially absent.
I’m sceptical that anyone really _should_ be using generative AI for anything where correctness matters at all, but spreadsheets in particular seem close to a worst-case scenario.
…without validating the results. Otherwise, why should we ever use LLMs for anything?