On 8/2/2023 4:53 PM, John Levine wrote:
> It appears that Keith Moore <moore@xxxxxxxxxxxxxxxxxxxx> said:
>> On 7/31/23 13:00, Phillip Hallam-Baker wrote:
>>> No, they aren't. Cryptography is binary but stopping pedophiles is not.
>> There's a *lot* of fuzz around what CSAM is. One of the many problems
>> with detecting CSAM is that it requires knowledge of things that aren't
>> present in the image, like the precise date at which the image was taken
>> and the birthday(s) of the subject(s) involved. ...
> While that is technically true, it is a red herring. People I know
> who deal with CSAM tell me that the stuff they are concerned with
> is small children having horrible things done to them.
> I realize there are opportunistic politicians freaking out about teens
> who send each other nude selfies, but (disregarding the somewhat
> separate issue of revenge porn) that's not the problem.
>> Everyone needs to understand that a likely effect of any CSAM
>> countermeasure is to increase the distribution and production of CSAM,
>> and with it the number of victims.
> Um, what?
That's the general tit-for-tat between offense and defense. We have seen
it with spam: better spam detection begets smarter hiding of spam, and
vice versa. The same is very likely to happen with CSAM. If a CSAM
scanning technology is applied at control points, the criminals who
profit from distributing this material would probably find some way to
tweak their materials and evade that particular technology, which will
evolve, etc. It will not stop if only a fraction of the criminals and
their audience are caught, because there will always remain a
substantial fraction in the wild to evolve their methods.
-- Christian Huitema