I am treating this issue as closed, unless someone wants to propose language changes based on John's analysis below.

EH

> -----Original Message-----
> From: John Bradley [mailto:ve7jtb@xxxxxxxxxx]
> Sent: Thursday, March 08, 2012 8:13 AM
> To: Eran Hammer
> Cc: Julian Reschke; ietf@xxxxxxxx; The IESG; oauth@xxxxxxxx
> Subject: Re: [OAUTH-WG] Last Call: <draft-ietf-oauth-v2-bearer-15.txt> (The OAuth 2.0 Authorization Protocol: Bearer Tokens) to Proposed Standard
>
> Thanks, that is better.
>
> Without knowing the lifetime of the token these are per-guess probabilities.
> Effectively 128 bits for a random value and 256 bits for an HMAC or other signature.
>
> For tokens intended for handling by end-users it may be useful to give some advice.
> In general you don't want an attacker having more than a one in 2^14 chance of
> guessing a valid code for an AS during the lifetime of the code (NIST LoA 2).
>
> For a code randomly generated from a 94-character code set, 4 characters gets
> you 26.3 bits of entropy.
> 4 characters requires limiting an attacker to 2^12.3 (5,042) guesses per token lifetime.
>
> For a code randomly generated from a 94-character code set, 5 characters gets
> you 32.9 bits of entropy.
> 5 characters requires limiting an attacker to 2^18.9 (489,178) guesses per token lifetime.
>
> If the token is single use and the client uses it right away, that is easy;
> however, in a worst-case scenario the token might live 10 minutes?
> That would be a maximum of 8.4 attempts per second for a 4-character code, or
> 815 per second for 5 characters.
>
> That is all way too much to explain; however, I would recommend as a general rule:
>
> Credentials intended for handling by end users SHOULD be a minimum of 5
> randomly generated characters from a set of 94, or otherwise contain a minimum
> of 32.9 bits of entropy.
>
> That is probably high enough that the AS will notice an attack; lower entropy
> may pass under the radar.
> Also, the chances of an attacker being successful go up proportionally to the
> number of simultaneous codes in flight at any point (it becomes a non-targeted attack).
>
> It isn't something that I will lose sleep over. It gives me something else to profile :)
>
> Thanks
> John B.
>
> On 2012-03-07, at 8:18 PM, Eran Hammer wrote:
>
> > New text:
> >
> > The probability of an attacker guessing generated tokens (and other
> > credentials not intended for handling by end-users) MUST be less than or
> > equal to 2^(-128) and SHOULD be less than or equal to 2^(-160).
> >
> > Removed reference to RFC 1750.
> >
> > EH
> >
> >> -----Original Message-----
> >> From: John Bradley [mailto:ve7jtb@xxxxxxxxxx]
> >> Sent: Monday, February 06, 2012 5:07 PM
> >> To: Eran Hammer
> >> Cc: Julian Reschke; ietf@xxxxxxxx; The IESG; oauth@xxxxxxxx
> >> Subject: Re: [OAUTH-WG] Last Call: <draft-ietf-oauth-v2-bearer-15.txt> (The OAuth 2.0 Authorization Protocol: Bearer Tokens) to Proposed Standard
> >>
> >> RE new text in Draft 23
> >>
> >> http://tools.ietf.org/html/draft-ietf-oauth-v2-23#section-10.10
> >>
> >> Generated tokens and other credentials not intended for handling by
> >> end-users MUST be constructed from a cryptographically strong random
> >> or pseudo-random number sequence ([RFC1750]) generated by the
> >> authorization server.
> >>
> >> Given that many implementations may elect to use signed tokens, such as
> >> SAML or JWT (JOSE), this should not be a MUST.
> >>
> >> Giving people sensible defaults would be better, such as: the probability of
> >> an attacker guessing a valid access token for the protected resource should
> >> be less than 2^(-128).
> >>
> >> The probability of randomly generating hash collisions is an odd metric; it
> >> is 2^(-128) for SHA-256, as I recall.
> >> Many factors play into what is secure: token lifetime, etc.
> >>
> >> I don't mind some reasonable defaults, but adding a requirement for
> >> unstructured tokens is a bit much.
> >>
> >> Regards
> >> John B.
> >>
> >
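
[Illustration only; not part of the archived thread.] A minimal sketch of the arithmetic John walks through above, plus one way an authorization server might mint credentials that meet the 2^(-128) guessing bound. It assumes Python 3 and its standard "secrets" module; the alphabet, function names, and the 10-minute lifetime are illustrative, not anything the draft specifies.

    # Illustrative sketch: redo the entropy arithmetic for a 94-character
    # code set and generate example credentials.
    import math
    import secrets
    import string

    # The 94 printable ASCII characters (letters, digits, punctuation),
    # matching the "94 character code set" in the thread.
    ALPHABET = string.ascii_letters + string.digits + string.punctuation

    def entropy_bits(length, alphabet_size=len(ALPHABET)):
        # Bits of entropy in a uniformly random code of the given length.
        return length * math.log2(alphabet_size)

    def max_guesses_per_lifetime(length, odds_bits=14):
        # NIST LoA 2 target: keep an attacker's odds of hitting a live code
        # at or below 2^-14, so allow at most 2^(entropy - 14) guesses
        # during the code's lifetime.
        return 2 ** (entropy_bits(length) - odds_bits)

    def end_user_code(length=5):
        # A 5-character code from the 94-character set: ~32.8 bits of entropy.
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    def bearer_token():
        # 32 random bytes = 256 bits of entropy, well past the 2^(-128) bound.
        return secrets.token_urlsafe(32)

    print(entropy_bits(4))              # ~26.2 bits
    print(max_guesses_per_lifetime(4))  # ~4,800 guesses per code lifetime
    print(entropy_bits(5))              # ~32.8 bits
    print(max_guesses_per_lifetime(5))  # ~448,000 guesses per code lifetime
    print(end_user_code())              # e.g. a 5-character one-time code
    print(bearer_token())               # e.g. an unstructured bearer token

The exact figures land slightly below John's rounded ones (log2(94) is about 6.55, so 4 characters give about 26.2 bits and roughly 4,800 allowable guesses, while 5 characters give about 32.8 bits and roughly 448,000 guesses), but the conclusion is the same: over a 10-minute lifetime that is roughly 8 and 750 guesses per second respectively.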