>> The obvious conclusion is Apple will start to scan photos kept on device, even where iCloud is not used.
Wrong [1]. It's in the very first line of the document, which you apparently didn't even read:
CSAM Detection enables Apple to accurately identify and report iCloud users who store
known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts
This doesn't mean I'm supporting their new "feature".

1. https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...