>> The obvious conclusion is Apple will start to scan photos kept on device, even where iCloud is not used.

Wrong [1]. It's stated in the very first line of the document, which you apparently didn't read:

  CSAM Detection enables Apple to accurately identify and report iCloud users who store
  known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts
This doesn't mean I'm supporting their new "feature".
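
For what it's worth, the gating that first line describes is roughly this shape. A minimal sketch in Swift, where `perceptualHash(of:)` and `knownCSAMHashes` are hypothetical stand-ins; the real system uses NeuralHash plus a private set intersection protocol with the server, not a local set lookup:

  import Foundation

  // Sketch of the gating logic the document describes, NOT Apple's
  // actual implementation. All names here are hypothetical.
  struct Photo {
      let data: Data
      let isQueuedForICloudUpload: Bool
  }

  // Hypothetical on-device database of known-CSAM hashes.
  let knownCSAMHashes: Set<UInt64> = []

  // Placeholder hash; the real NeuralHash is a neural-network image
  // hash robust to resizing/recompression, nothing like Data.hashValue.
  func perceptualHash(of photo: Photo) -> UInt64 {
      UInt64(truncatingIfNeeded: photo.data.hashValue)
  }

  func shouldFlag(_ photo: Photo) -> Bool {
      // Per the quoted line, only photos bound for iCloud Photos
      // are ever matched; purely local photos are skipped.
      guard photo.isQueuedForICloudUpload else { return false }
      return knownCSAMHashes.contains(perceptualHash(of: photo))
  }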

1. https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...



nah. it means they don’t scan on-device photos yet.

also, reading that doc and pointing to it means you trust apple.

i used to trust apple when they were peddling their privacy marketing stuff. not anymore.


OK, so let's all lose our shit over things that haven't happened.



