On 7/31/23 13:00, Phillip Hallam-Baker wrote:
> No, they aren't. Cryptography is binary but stopping pedophiles is not.
There's a *lot* of fuzz around what CSAM is. One of the many
problems with detecting CSAM is that it requires knowledge of
things that aren't present in the image, like the precise date at
which the image was taken and the birthday(s) of the subject(s)
involved. Outside of very wide parameters, this cannot be
reliably determined by a human or an AI looking at the image, and
any attempt to do so will result in large numbers of false
positives and negatives.
The FBI has a database of such images with documentation to be
used as evidence against those accused of CSAM possession. But
for obvious reasons they don't want to release those images.
And having some oracle in software that compares images against such
a database, even if the images themselves can't be directly extracted,
potentially enables CSAM distributors and producers, because AI can
then be used to repeatedly alter an image, querying the oracle each
time, until the image is no longer flagged.
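That evasion loop is easy to sketch. The following is a hypothetical illustration only: a toy 8x8 average hash stands in for whatever matching algorithm the real oracle uses, the Hamming-distance threshold is an arbitrary assumption, and the function names (`oracle`, `evade`) are made up for the example. The point is just that access to a yes/no match oracle lets small perturbations be accumulated until a match no longer fires.

```python
import random

def average_hash(img):
    # Toy perceptual hash: threshold each pixel of an 8x8 grayscale
    # image against the image's mean, yielding 64 bits.
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    # Number of differing bits between two hashes.
    return sum(x != y for x, y in zip(a, b))

def oracle(img, target_hash, threshold=10):
    # Hypothetical match oracle: True means "flagged as a match"
    # (hash is within `threshold` bits of a database entry).
    return hamming(average_hash(img), target_hash) <= threshold

def evade(img, target_hash, rng, max_steps=20000):
    # Nudge random pixels by small amounts, querying the oracle after
    # each change, until the image is no longer flagged.
    img = [row[:] for row in img]
    for _ in range(max_steps):
        if not oracle(img, target_hash):
            return img
        r, c = rng.randrange(8), rng.randrange(8)
        img[r][c] = min(255, max(0, img[r][c] + rng.choice((-40, 40))))
    return img

rng = random.Random(0)
original = [[rng.randrange(256) for _ in range(8)] for _ in range(8)]
target = average_hash(original)          # the database entry
evaded = evade(original, target, rng)    # perturbed until not flagged
print(oracle(original, target), oracle(evaded, target))
```

Real perceptual hashes are more robust than this toy, but the attack shape is the same: each oracle query leaks a bit of information, and small pixel changes that preserve the image's appearance can accumulate until the hash falls outside the match threshold.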
Everyone needs to understand that a likely effect of any CSAM
countermeasure is to increase the distribution and production of
CSAM, and with it the number of victims. It's absolutely
essential to look beyond the intentions and analyze the potential
and likely effects.
Keith