I know people who work on CSAM, and while they are uniformly working
hard to fight it, they do tend toward a degree of tunnel vision and the
unfortunate assumption that anyone who makes their jobs harder is
ignorant or malicious. And then there's the "nerd harder and give us a
back door only good people can use" stuff.
Speaking of CSAM, there is this EU proposal for regulation to detect
Child Sexual Abuse Material.
A brief summary (from Prof. Bart Preneel [0]):
- the approach is technically infeasible today and no path can be
imagined that can make it feasible;
- as conceived, it will lead to a massive number of false positives,
resulting in false accusations of serious crimes and very high costs;
- serious criminals will find it easy to bypass it;
- the proposal presents substantial risks for function creep & abuse by
non-democratic regimes to go after regime-critical content;
- the proposal puts the security of our digital society at risk by
undermining end-to-end encryption;
- the proposal will have a chilling effect on online communication;
- the proposal violates basic human rights and is not proportional.
More productive approaches exist: it should be made easier to report
abuse and local social services should be strengthened.
Prof. Preneel and others have organized an open letter to the EU
explaining why it is a bad idea [1]. He also delivered a keynote at the
IEEE EuroS&P conference [2], and I hope his video and slides will be
made available soon.
[0]
https://www.linkedin.com/feed/update/urn:li:activity:7081940917808459777/
[1]
https://docs.google.com/document/d/13Aeex72MtFBjKhExRTooVMWN9TC-pbH-5LEaAbMF91Y/edit
[2] https://eurosp2023.ieee-security.org/
/giovane
(no hats)