On Aug 3, 2007, at 11:24 AM, Dave Crocker wrote:

> My point was about the failure to make sure there was large-scale,
> multi-vendor, in-the-wild *service*. Anything that constrains what
> can go wrong will limit the ability to make the technology robust
> and usable.

There are currently millions of unconstrained large-scale, in-the-
wild services being manipulated and controlled by criminals.
The constraints that must be taken seriously are the economics that
limit the staging of DDoS attacks. Criminals often utilize tens of
thousands of 0wned systems, and these systems routinely send large
email campaigns. Any scheme in which receipt of a message invokes a
process that initiates additional traffic must be considered
carefully. Expecting recipients to use the local-part of a purported
originator's email address to construct dozens or even hundreds of
DNS transactions wholly unrelated to the actual source of the message
is a prime example of the economics of DDoS attacks being shifted
gravely in the wrong direction.
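
To make the mechanism concrete, here is a minimal sketch of such a
record; the domain names are invented for illustration. An attacker
controlling attacker.example could publish a single SPF TXT record
along these lines:

  attacker.example. IN TXT "v=spf1 exists:%{l}.%{i}._a.victim.example -all"

Because the %{l} (local-part) and %{i} (client IP) macros expand
differently for every message, each SPF evaluation produces a
cache-defeating query, and the exists: mechanism aims those queries
at name servers of the attacker's choosing rather than at any host
that actually handled the mail.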
Spammers are already spamming, either directly or through an ISP's
outbound server. Lacking a reasonable constraint on the recipient
process, criminals can mount such an attack without expending their
own resources and without exposing the location of their systems.
Using SPF/Sender-ID as an example, just one DNS resource record is
able to source an attack composed of millions of recipient-generated
DNS transactions. The lack of constraint on the receipt process in
this case is an example of either negligence or incompetence. Such an
attack may employ several levels of indirection, and yet nothing
germane to it will be found in email logs.
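
A back-of-envelope sketch of the resulting asymmetry, in Python; the
message and lookup counts are illustrative assumptions, not
measurements:

  # Rough economics of an SPF macro-based reflection attack.
  # All figures are illustrative assumptions, not measurements.
  messages = 10_000_000      # spam run relayed through 0wned systems
  lookups_per_message = 10   # SPF permits up to 10 DNS-querying
                             # mechanisms; macro expansion makes each
                             # queried name unique, defeating caches
  attacker_records = 1       # a single TXT record, published once

  recipient_queries = messages * lookups_per_message
  print("records published by the attacker:", attacker_records)
  print(f"DNS queries generated by recipients: {recipient_queries:,}")
  # -> 100,000,000 queries, none of which appear in the attacker's
  #    own traffic or point back at the attacker's systems

The attacker's cost is fixed at one record, while the recipients'
cost scales with the size of the spam run.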
Failing to impose reasonable constraints makes the Internet neither
more robust nor more usable.
-Doug