You absolutely have a point. But I'm not sure how to balance privacy and safety. Is my service really private at all if I'm handing user files off to a third party to do who knows what while scanning for bad content, and potentially exposing users to false positives?
Edit: A local model could work, but that can be quite compute intensive and therefore expensive.
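To make the cost concern concrete: since local inference is slow, one common mitigation is to run it off the upload path in a background worker instead of blocking each upload. A minimal sketch, where `classify()` is a hypothetical stand-in for whatever local model you'd use (it is not a real library call):

```python
import queue
import threading

def classify(path: str) -> float:
    # Hypothetical stand-in for a local model (e.g. an image classifier).
    # A real implementation would load weights and run inference here.
    return 0.0  # returns a probability that the file is abusive

scan_queue: "queue.Queue[str]" = queue.Queue()
FLAG_THRESHOLD = 0.8  # illustrative cutoff, would need tuning

def worker():
    while True:
        path = scan_queue.get()
        if classify(path) >= FLAG_THRESHOLD:
            print(f"flag for human review: {path}")
        scan_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

# Uploads enqueue files instead of blocking on inference:
scan_queue.put("/uploads/abc123.jpg")
scan_queue.join()
```

This doesn't make inference cheaper, but it keeps upload latency flat and lets you size the worker pool (and hardware spend) independently of traffic spikes.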
There's no balance to be had: you must prioritize legality over privacy. You will be storing CSAM if you don't do something. You may already be storing CSAM. This is no joke. This is real and something every image hosting site deals with. You need to take it seriously. This is a "you could go to jail" concern, not a "this project might not work out" concern.

The ability to store and share media privately while knowing it won't be scanned for abuse, with a free tier that doesn't even require an email address to sign up, is begging to be used for CSAM and other illegal activities. That's the sort of site you'd set up if your explicit goal was to attract CSAM. MEGA offers a similar service and they are severely burdened with abuse.
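For context on what "scanning" usually means here: most hosts match uploads against hash lists of known abusive material (industry systems like PhotoDNA or PDQ use perceptual hashes so re-encoded copies still match). A minimal sketch of the idea using plain SHA-256, which only catches byte-identical files; the hash list entry below is a placeholder, not real data:

```python
import hashlib

# Placeholder entry; a real deployment would load an industry-provided
# hash list, and would use perceptual hashing rather than SHA-256.
KNOWN_BAD_HASHES = {
    "0" * 64,
}

def is_known_bad(data: bytes) -> bool:
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES

def handle_upload(data: bytes) -> str:
    if is_known_bad(data):
        # Reject storage and report per your jurisdiction's requirements.
        return "rejected"
    return "stored"
```

The point of hash matching over classifiers is precision: a match against a vetted list is near-certain, which sidesteps much of the false-positive worry raised above.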
I meant scanning only the reported content; to me that's a proper balance, since your legal requirement[0] is to take down content that gets reported. But since reports themselves are ripe for abuse, the proper flow is to first hide the content, review and confirm it's actually bad, and only then take action.
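That hide-then-review flow can be sketched as a tiny state machine; the names here are illustrative, not from any particular framework:

```python
from enum import Enum

class Status(Enum):
    VISIBLE = "visible"
    HIDDEN_PENDING_REVIEW = "hidden_pending_review"
    REMOVED = "removed"

class MediaItem:
    def __init__(self, media_id: str):
        self.media_id = media_id
        self.status = Status.VISIBLE

    def report(self):
        # Hide immediately on report so nothing stays up while it waits
        # in the review queue.
        if self.status is Status.VISIBLE:
            self.status = Status.HIDDEN_PENDING_REVIEW

    def review(self, confirmed_bad: bool):
        # A human reviewer confirms the report (remove permanently)
        # or rejects it (restore visibility, defeating abusive reports).
        if self.status is Status.HIDDEN_PENDING_REVIEW:
            self.status = Status.REMOVED if confirmed_bad else Status.VISIBLE

item = MediaItem("abc123")
item.report()
print(item.status)  # Status.HIDDEN_PENDING_REVIEW
item.review(confirmed_bad=False)
print(item.status)  # Status.VISIBLE
```

Hiding-by-default caps your liability window, while the confirm step means a brigade of false reports can't permanently take down legitimate content.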
So I would try asking around, or thinking through how best to handle the specific reported cases without exposing yourself too directly.