Exceptions make it considerably harder to reason about state by reading the program text. As the notion that programmers should have some actual understanding of what they write slowly comes back into fashion, language features that make code needlessly harder to understand are losing some of their appeal, even though they speed up writing the code.
What really makes code hard to read is having multiple paths to disentangle. There is one little error deep in the call stack, but you have to vandalize the ten functions above it to carefully separate the error and non-error paths. What is the probability that you will clean up properly on both paths when it isn't done for you with finally? What is the probability that somebody reading this code will actually find the subtle error in the error path, or an error in the happy path caused and hidden by the complexity of the unhappy path?
I think the first C program I saw was a type-in terminal emulator from Byte magazine, around 1985, and I was struck by the awkwardness of the error handling in the C stdlib. I spent a lot of time looking at the code when I realized the author had "spaced it" at one point, such that the error handling was wrong, and thought "this sucks" -- but I learned to write C programs at 3x the LOC because of all the alternate paths I had to put in to handle errors.
When I saw exceptions for the first time I felt strongly liberated, because I got for free what I had been working so hard for in C. I got to spend more time thinking about algorithms, the needs of the customer, things like that.