> That's not actually correct. Most network protocols use the
> "Distinguished Encoding Rules" (DER) not the "Basic Encoding Rules"
> (BER). BER is an abomination and should never, ever have been in
> the standard; the only protocol commonly used over IP that uses BER
> is LDAP, because it descends from DAP, which used BER.
>
> So you can't reasonably assume that if it uses ASN.1, it uses
> BER. That's presumably why Microsoft left certain ASN.1-using
> network services turned on.

DER is not a transfer encoding syntax. Its purpose is to give the same
sequence of octets for a given abstract value (as does CER - Canonical
Encoding Rules). LDAP, OSI, SNMP etc. all use BER, albeit with
restrictions in some cases (e.g. no use of indefinite length encoding).
The Cryptographic Message Syntax (RFC 3369) requires DER for some
elements, but specifically allows BER overall. [OSI can use PER -
Packed Encoding Rules - but that is a whole different ballgame.]

DER is used for signing, not for transfer. I.e. if you get an ASN.1
value in a protocol (whatever the transfer syntax), you need to
re-encode the value (paying attention to the ASN.1 type) as DER to
generate the hash. In any case, DER is a subset of BER (as is CER).

The particular vulnerability in question arises in the encoding of
primitive elements, where DER and BER are basically the same (except
that with DER you are required to encode the length field in the
minimum number of octets, which is not the case for BER).

The bottom line is: given a PDU or object encoded in DER, you would
use a BER decoder on it.

I'm not sure what the issue is which causes BER to be an "abomination"
while DER is OK.

cheers

--
David Wilson <David.Wilson@isode.com>
Isode Limited
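
A minimal sketch of the length-octet point above, not taken from the post
itself: the encode_length helper and the OCTET STRING example are
illustrative assumptions, not part of any ASN.1 library. It encodes the
same primitive value once with the minimal length that DER requires, and
once with a long-form length that is legal BER but not legal DER.

def encode_length(n: int, long_form: bool = False) -> bytes:
    """Encode an ASN.1 length field.

    DER mandates the short form (one octet) whenever n < 128;
    BER also permits the long form for such values.
    """
    if n < 0x80 and not long_form:
        return bytes([n])                    # short form: a single octet
    body = n.to_bytes(max(1, (n.bit_length() + 7) // 8), "big")
    return bytes([0x80 | len(body)]) + body  # long form: 0x80|L, then L octets

payload = b"hello"

# DER encoding of OCTET STRING "hello": tag 0x04, minimal length, contents.
der = bytes([0x04]) + encode_length(len(payload)) + payload

# BER also allows a non-minimal (long form) length for the same primitive
# value; a strict DER decoder must reject this, a BER decoder accepts both.
ber = bytes([0x04]) + encode_length(len(payload), long_form=True) + payload

print(der.hex())  # 040568656c6c6f
print(ber.hex())  # 04810568656c6c6f

Both byte strings represent the same abstract value, which is why a BER
decoder can be used on DER-encoded input, while only the first encoding
is acceptable as input to a hash for signing.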