> There were parents arrested over bath-time and sprinkler photos back when film was processed at photo labs. Will the same thing happen with Apple mistakenly reporting parents?
No, because they’re not identifying content, they’re matching it against a set of already-known CSAM that NCMEC maintains. As you go on to say, telecoms and other companies already do this. Apple just advanced the state of the art when it comes to the security and privacy guarantees involved.
A set of unverified hashes that you hope only came from NCMEC. Telecoms do this on their own devices - not yours.
Apple just opened the door to constant searches of your digital devices. If you think it will stop at CSAM, you have never read a history book: the single biggest users of the UK's camera system, originally intended for serious crimes, are housing councils checking who didn't clean up after their dog.
> A set of unverified hashes that you hope only came from NCMEC. Telecoms do this on their own devices - not yours.
Yes, those are the main things we’re concerned about.
> Apple just opened the door for constant searches of your digital devices.
Specifically, it opens the door to them scanning content which is then end-to-end encrypted, which is the main problem.
I think the jury is out on whether this capability will be abused. Apple has said they will reject requests to use it for other purposes, but who really knows whether they will end up being forced to add hashes that aren’t CSAM?
I agree that both of these are potential problems.
> the single biggest user of UKs camera system originally intended for serious crimes are housing councils checking to see who didn't clean up after their dog.
Which camera system? Do you have a citation for that?
That's absolute nonsense, but it's one of those things where I'd be interested to try to unpick the provenance of how someone could come to believe something so ridiculous.
Anecdotally my Mum works for Coventry City Council (though she is in events planning) but has noted complaints from colleagues about “busy work” from “fussy old people who keep asking for camera footage” — though Coventry often declines.
One or two news reports of local councils perhaps using CCTV don't back up your claim.
The UK doesn't have a unified super-camera system used for minor crimes, as you insinuate.
The high camera counts reported for the UK come from including private CCTV cameras in the data. Those are privately owned and not linked together, so the government is not using a network of cameras to monitor dog-poo clean-up as you claim.
I don't know about the UK, but in France they are now using CCTV to fine badly parked delivery drivers for a two-minute stop. While I agree that a vehicle parked anywhere can be a big inconvenience and deserves a fine, I don't think that's a crime big enough to justify deploying such a surveillance system.
This totally makes sense from their point of view: one cop checking 100+ CCTV feeds is far more efficient than a full team walking the streets. Once you've justified the privacy cost and managed to deploy such a system, it's easy and convenient to use it for something else.
The UK government explicitly lays out a strategy for private cameras to be bought and operated with mandatory rules for police access to footage. [1]
This is on top of the cameras that ARE owned by government entities - 18+ city councils [2]. And it's expanding [3].
Why do you think it matters whether they are linked together? Retaining footage and handing it over to police on request (not a warrant) is a requirement. The IPA allows collecting this information in bulk (e.g. from CCTV providers) with warrants. [4]
I don't think any of your links remotely substantiate what you claimed ("the single biggest user of UKs camera system originally intended for serious crimes are housing councils checking to see who didn't clean up after their dog.").
What even is the "camera system originally intended for serious crimes"?
It's a lossy hash match, though. If it weren't, then subtly re-encoding the image would hide it. So they're definitely going to be mistakenly matching some images.
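To make the point concrete, here is a toy sketch of a lossy perceptual hash (a simple "average hash", not Apple's NeuralHash, which is an ML-based scheme): a slightly perturbed image, as after lossy re-encoding, still produces the same hash, which is exactly why such schemes survive re-encoding but can also match images that were never the same.

```python
# Toy average-hash (aHash) sketch. Assumes an 8x8 grayscale "image" as a
# list of lists of 0-255 values. This is an illustrative analogy only,
# NOT Apple's actual NeuralHash algorithm.

def average_hash(pixels):
    """Hash an 8x8 grayscale image to a 64-bit int: one bit per pixel,
    set if that pixel is at or above the mean brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Toy image: a bright diagonal on a dark background.
img = [[255 if r == c else 20 for c in range(8)] for r in range(8)]

# Simulate lossy re-encoding: add a little noise to every pixel.
noisy = [[min(255, p + 5) for p in row] for row in img]

print(hamming(average_hash(img), average_hash(noisy)))
```

The hash distance here is 0 despite every pixel changing, whereas a cryptographic hash (SHA-256, say) of the two files would differ completely. The flip side of that robustness is that two genuinely different images can land close together too, which is where mistaken matches come from.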