At 9:08 AM -0400 6/4/02, Wietse Venema wrote:
>The proper approach is to eliminate such ambiguity, by normalizing
>data, that is, by transforming messages into a form that avoids
>all the grey areas where implementations err, or where RFCs are
>ambiguous.

Which is non-trivial, and also runs the risk of taking things that
passed a scanner and turning them into something dangerous.

The old adage for standards of "make your output conform strictly,
but be lenient in what you accept" simply isn't appropriate for a
secure environment. Microsoft has played very fast and loose with
what their software accepts (backslashes in URLs, mis-typed MIME
files that have their type determined by content...) and we are all
dealing with the consequences. That model worked well when the input
was from a user. It does not work well when the input is from
servers (which can be corrected) and untrusted sources (which should
be rejected).

I would go the other route with a scanner/interpreter. If the input
doesn't match your understanding of the standard--reject it.
Actually, I was going to say, "or turn it into plain text", but
there again we run into the problem of software which is overly
happy to interpret what the remote sender "meant". I really don't
think there's any other safe solution. Of course politically, if
what you are rejecting is output by some major vendor--you've got a
problem.

--
Kee Hinckley - Somewhere.Com, LLC       http://consulting.somewhere.com/

I'm not sure which upsets me more: that people are so unwilling to
accept responsibility for their own actions, or that they are so
eager to regulate everyone else's.
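
To make the "reject it" approach above concrete, here is a minimal
sketch of a strict Content-Type check, assuming a hypothetical
scanner with a fixed allow-list of media types (the names
ALLOWED_TYPES and check_content_type are illustrative, not from any
real filter): anything that does not parse cleanly as type/subtype,
or that names a type the scanner is not prepared to handle, is
rejected outright rather than guessed at from the body.

    import re

    # Hypothetical allow-list: only the types this scanner is
    # prepared to interpret.  Everything else is refused.
    ALLOWED_TYPES = {"text/plain", "text/html",
                     "message/rfc822", "multipart/mixed"}

    # Deliberately strict: type "/" subtype, with optional
    # ";"-introduced parameters, and nothing cleverer than that.
    CONTENT_TYPE_RE = re.compile(
        r"^([a-z0-9!#$&^_.+-]+)/([a-z0-9!#$&^_.+-]+)\s*(;.*)?$",
        re.IGNORECASE)

    def check_content_type(header_value):
        """Return the normalized type/subtype, or raise ValueError.

        The point is to refuse anything that does not match our
        reading of the standard, instead of sniffing the body to
        work out what the sender "meant".
        """
        match = CONTENT_TYPE_RE.match(header_value.strip())
        if match is None:
            raise ValueError("malformed Content-Type header: reject")
        declared = "%s/%s" % (match.group(1).lower(),
                              match.group(2).lower())
        if declared not in ALLOWED_TYPES:
            raise ValueError("unhandled media type %r: reject" % declared)
        return declared

Wiring a check like this into a real filter runs straight into the
political problem mentioned above: mail from a major vendor that
fails the check simply bounces.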