
These harmless generated images have a NeuralHash identical to hashes in the NCMEC database submitted for testing. I repeat: don't upload these harmless images to iCloud, as Apple will assume they are CSAM (child sexual abuse material). Scripts were available in a GitHub repo but were removed because they could cause harm to others.
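
For anyone unfamiliar with what a perceptual-hash collision means in practice, here is a minimal sketch. NeuralHash itself is a neural network and is not reproduced here; the toy "average hash" below (and the file names) are stand-ins for illustration only. The point is that two visually different images can map to the same short hash value:

    # Toy "average hash": resize to 8x8 grayscale, threshold each pixel
    # against the mean. NOT NeuralHash; illustration of the concept only.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    # Hypothetical file names, for illustration only:
    if average_hash("harmless_generated.png") == average_hash("database_image.png"):
        print("collision: same hash, visually different images")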


Are the hash database and the hashing algorithm public? Or how do you know these match?


Of course the database is public, it's on every iPhone!


No, it’s not.


> Scripts were available on a GitHub repo but were removed because they may cause damage to others.

Is there an archived link?

Edit: I guess this? https://gist.github.com/unrealwill/c480371c3a4bf3abb29856c29...


> Don't upload these harmless images to iCloud, as Apple will assume they are CSAM

This is not true. They may match the hash, but they will not match the visual derivative.

The system is not as easily fooled as you think.
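
A rough sketch of that two-stage idea, under the assumption (from Apple's published technical summary) that a hash match alone is only a candidate and the low-resolution visual derivative is inspected before anything is reported; all names below are made up:

    from typing import Callable, Set

    def is_actionable_match(
        image_hash: int,
        derivative: bytes,
        database: Set[int],
        derivative_check: Callable[[bytes], bool],
    ) -> bool:
        if image_hash not in database:
            return False                      # no hash match at all
        return derivative_check(derivative)   # collision alone is not enough

    # A forged collision passes the hash test but fails the derivative
    # check, so nothing happens:
    print(is_actionable_match(0xDEADBEEF, b"harmless pixels",
                              {0xDEADBEEF}, lambda d: False))  # False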


We've learned from YouTube how well content matching works. Apple will be better, right? Right?


Yes. We know that from the papers explaining how it works.


> The system is not as easily fooled as you think.

I would like to believe that is true, but the negative consequences of even generating a false positive are enough of a reason not to attempt to upload any image.


I tend to disagree here...

Based on Apple's documentation, they wait for *several* matches, *not just one* (we don't know what *several* means, but I don't expect it to be as low as three or fewer pictures). Once that threshold is reached, a human review team inspects the "positive matches" and decides whether or not the images are CSAM.

If they are, after that manual review, the authorities are called.
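
A minimal sketch of that threshold logic; the threshold value below is invented, since Apple had not published the real one, and in the real design the gating is enforced cryptographically via threshold secret sharing, not by a plain counter:

    MATCH_THRESHOLD = 30  # made-up value for illustration

    def should_escalate_to_review(match_count: int) -> bool:
        # Below the threshold the server learns nothing about individual
        # matches; only at or above it can the match vouchers be opened
        # and handed to human reviewers.
        return match_count >= MATCH_THRESHOLD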


Hypothetically, what happens if a viral event persuades people to mass-upload these images? Would Apple modify their review protocol ad hoc?


Nothing, because these files won't trigger a match.


You mean like how people got so fed up with ToS-mandated arbitration that they all decided to file motions simultaneously?

It worked that time...


The consequence is that someone at Apple reviews the case, notices that it's a false positive, and closes the case.


If Google Drive scans with the same database, then how is your link working?


Because they are scanning with a different hash system.
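
To illustrate why a collision under one perceptual hash does not carry over to another, here are two toy hashes (neither is NeuralHash or whatever Google uses); they extract different features from the same image, so a pair engineered to collide under one will almost certainly differ under the other:

    from PIL import Image

    def avg_hash(img: Image.Image) -> int:
        # hash A: threshold each pixel of an 8x8 grayscale against the mean
        g = img.convert("L").resize((8, 8))
        px = list(g.getdata())
        m = sum(px) / len(px)
        return sum((p > m) << i for i, p in enumerate(px))

    def diff_hash(img: Image.Image) -> int:
        # hash B: compare horizontally adjacent pixels of a 9x8 grayscale
        g = img.convert("L").resize((9, 8))
        px = list(g.getdata())
        return sum((px[r * 9 + c] > px[r * 9 + c + 1]) << (r * 8 + c)
                   for r in range(8) for c in range(8))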



