Martin:
On Wed, Mar 20, 2019, at 22:50, Keith Moore wrote: We can either stick our heads in the sand, or try to have some foresight. Our foresight will certainly be imperfect, but it's probably better than willful blindness.
I agree with Keith's points here, but I don't think that the draft is demanding any more than what we can (and mostly already) do. Just a recognition that what we do isn't free of consequence and that we have a responsibility to engage with those consequences.
One cannot but agree with the abstracted point you have made. Unfortunately, that is not what the draft actually states (see my first email in this thread). I don't think that we've had a problem of willful ignorance or blindness. I'd suggest that it's more a case of denial: denial of the fact that there are wider consequences to our choices and, more seriously, denial of the fact that ignoring them is itself a very particular way of engaging with those consequences.
A claim that others are in denial is either true or a discounting of considered experience and conclusions of people with different perspectives. Can you elaborate on why you believe it to be the former and not the latter?
Yes, we are better at making decisions when we can concentrate on technical matters, but any decisions we make will be more relevant when their greater context is considered in that process. My view is that anything less would be irresponsible.
That's not only because we as a community are primarily techies, but also because people have different priorities and concerns. It is hard to find consensus on our work when conflicting values have technical implications; and in those cases it should be hard.
As you implied above, this tussle is already necessarily playing out. A new normative document could, by design, place a finger on the scale, as I previously wrote, in such a way as to short-circuit consideration of consequences, leading to the very irresponsible outcome you wish to avoid.
Eliot