These harmless generated images have a NeuralHash identical to entries in the NCMEC database that was provided for testing.
I repeat: don't upload these harmless images to iCloud, as Apple will treat them as child sexual abuse material (CSAM).
Scripts were available in a GitHub repo but were removed because they could cause harm to others.
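For anyone who wants to sanity-check a claimed collision themselves, a minimal sketch is below. It assumes you already have NeuralHash hex digests for your images (e.g. from the extracted ONNX model); the digest values and function names here are purely illustrative, not part of any real tool.

```python
def is_collision(hash_a: str, hash_b: str) -> bool:
    """Two images 'collide' when their NeuralHash digests are identical."""
    return hash_a.strip().lower() == hash_b.strip().lower()


def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Bit-level distance between two equal-length hex digests."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")


if __name__ == "__main__":
    # Hypothetical digests: one from a harmless generated image,
    # one from a database entry. Identical digests = a collision.
    generated = "59a34eabe31910abfb06f308"
    target = "59a34eabe31910abfb06f308"
    print("collision:", is_collision(generated, target))
    print("hamming distance:", hamming_distance(generated, target))
```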
Based on Apple's documentation, they wait for *several* matches, *not just one* (we don't know what *several* means, but I don't expect it to be as low as 3 or fewer pictures).
Once that threshold is reached, a human review team is asked to look at the "positive matches" and decide whether the images are actually CSAM.
If they are, the authorities are notified after that manual review (a sketch of this flow follows below).
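Here is a minimal sketch of that flow as described above: count per-account matches, escalate to human review only once a threshold is crossed, and report only after a reviewer confirms. The threshold value, data structures, and hooks are all placeholders for illustration, not Apple's actual implementation.

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 10  # placeholder; the real value of "several" is not public


@dataclass
class Account:
    account_id: str
    matched_hashes: set[str] = field(default_factory=set)


def record_upload(account: Account, neuralhash: str, known_hashes: set[str]) -> None:
    """Record an uploaded image and escalate only past the match threshold."""
    if neuralhash in known_hashes:
        account.matched_hashes.add(neuralhash)

    # Below the threshold nothing happens; a single (possibly colliding)
    # match is not enough to trigger anything.
    if len(account.matched_hashes) < MATCH_THRESHOLD:
        return

    # Past the threshold, the flagged material goes to human review,
    # and only confirmed material is reported to the authorities.
    if human_review_confirms(account):
        notify_authorities(account)


def human_review_confirms(account: Account) -> bool:
    """Stand-in for the manual review step; always declines in this sketch."""
    return False


def notify_authorities(account: Account) -> None:
    print(f"report filed for {account.account_id}")
```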