Re: Last Call: <draft-farrell-perpass-attack-02.txt> (Pervasive Monitoring is an Attack) to Best Current Practice

On 02/01/2014 06:12, John C Klensin wrote:
> Another message from the 11-12 December thread, then I'm going
> to try to crawl back out of this and get some work done...
> 
> --On Thursday, December 12, 2013 15:20 +1300 Brian E Carpenter
> <brian.e.carpenter@xxxxxxxxx> wrote:
> 
>>> If it is, then it would seem to call for "ubiquitous
>>> confidentiality" unless you are making a very fine point.

>> Indeed it is making a fine point - what it calls for is the
>> IETF to provide technological mechanisms that allow operators
>> and users to protect privacy. To what extent those mechanisms
>> are deployed is not under the IETF's control and will
>> presumably vary between countries.
> 
> Brian,
> 
> I'm sorry, but, beyond a certain point, that sets you (and us)
> up for a position that has an extremely poor ethical (and
> engineering) history.  The worst examples take us straight to
> various principles that have names, but "we just invented this
> technology, you can't blame us for how it was used or its
> consequences" is perhaps the least negative of those
> examples.

That might depend on how one feels about Mikhail Kalashnikov's defence:
"It is not my fault that the Kalashnikov was used in many troubled places.
I think the policies of these countries are to blame, not the designers."
But I think you're over-interpreting my words. All IETF standards
are voluntary, so anything we specify that protects privacy is
also voluntary and we can't force people to use it.

> It is also, at least IMO, bad engineering because
> good engineering has to consider the entire constraint space and
> system, even if the constraints are economic or social and not
> just physics.

I can't disagree. Unfortunately the constraint space includes
jurisdictions with things like FISA courts or worse.

> Worse, we are already more than halfway into the sociopolitical
> side of the problem by even getting started in this discussion.
> Even if there may be some associated technical problems and
> opportunities, privacy isn't a technical problem either.  More
> important, the expectation of privacy isn't a technical problem;
> we create a technical problem only when we assume that
> expectation and its reasonableness.   I happen to disagree with
> those who say "ok, it can be ignored" or "any expectation of
> privacy has become unreasonable" and assume you do too, but,
> when I complain about pain when I try to perform particular
> actions, my physician is fond of saying "so don't do that".  If
> one really has no expectation of privacy, then there is no
> technical or other problem with surveillance, pervasive or
> otherwise.  To go as far as we are going and then appeal to "not
> a technical problem" or, worse, ethical imperatives about the
> consequences of how our work is applied is, to be polite,
> disingenuous.

Yes. All I intended was to be realistic: whatever we specify,
we can't prevent people from ignoring it. I think we have plenty
of running-code proof of that.

    Brian
