Hi all,

I'd like to announce the availability of 'stompy', a free tool to perform a fairly detailed black-box assessment of WWW session identifier generation algorithms. Session IDs are commonly used to track authenticated users, and as such, whenever they're predictable or simply vulnerable to brute-force attacks, we have a problem.

[ The reason I'm cc:ing BUGTRAQ is that this tool has already revealed several new potential weaknesses in application platforms, and can be readily used to find more - for example, it is my impression that BEA WebLogic and Sun Java System Web Server both have problems with their JSESSIONIDs [1]; proprietary solutions by some of the larger portals / e-commerce sites didn't always earn a passing grade, either. ]

Why bother?
===========

Some session ID cookie generation mechanisms are well-studied, well-documented, and believed to be cryptographically secure (examples: the Apache Tomcat, PHP, and ASP.NET built-ins). This is not necessarily so for certain less-researched enterprise web platforms - and almost never so for custom solutions that are frequently implemented inside the web application itself.

Yet, while there are several nice GUI-based tools designed to analyze HTTP cookies for common problems (Dawes' WebScarab, SPI Cookie Cruncher, Foundstone CookieDigger, etc.), they all seem to rely on very trivial tests, if any, when it comes to unpredictability ("alphabet distribution" or "average bits changed" being about as sophisticated as it gets); this functionality is often no better than a quick pen-and-paper analysis, and can't be routinely used to tell a highly vulnerable linear congruential PRNG (rand()) from a well-implemented MD5 hash scheme (/dev/urandom). As far as I can tell, today's super-bored pen-testers can at best collect data by hand, determine its encoding, write conversion scripts, and then feed the result to the NIST Statistical Test Suite or the like - but few will.

What's cool?
============

In order to have a fully automated, hands-off tool that reliably detects anomalies not readily apparent at first glance, I devised a utility that:

  - Automatically finds session IDs in URLs, cookies, and form inputs, then collects a statistically significant sample of data,

  - Determines the alphabet structure to transparently handle base64, uuencode, base32, hex, and any other sane encoding scheme without user intervention,

  - Translates the data into isolated time-domain bitstreams to examine how SID bits at each position change over time,

  - Runs a suite of FIPS-140-2 PRNG evaluation tests on the sample,

  - Runs an array of n-dimensional phase space tests to find deterministic correlations, PRNG hyperplanes, and the like.

Of course, the tool cannot prove the correctness of an implementation, and it is possible to devise predictable, cryptographically unsafe PRNGs that would pass these tests; still, the tool can find plenty of problems and oddities.

Well, that's it - a rough sketch of the bitstream idea follows below; for more, see the included README file.
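To give a very rough idea of what two of those steps look like, here is a small Python sketch - my own simplification for illustration, not code taken from stompy - that maps a batch of captured session IDs onto their observed alphabet, splits them into one time-domain bitstream per bit position, and flags positions whose ones-frequency strays suspiciously far from 50%, a crude cousin of the FIPS-140-2 monobit test. The stdin interface, the 3-sigma threshold, and the helper names are all assumptions made up for this example; the phase-space tests and the remaining FIPS tests are not shown.

  #!/usr/bin/env python3
  # Illustrative sketch only - not stompy's actual code. It turns a sample of
  # collected session IDs into per-position "time-domain" bitstreams and runs
  # a simplified monobit-style frequency check on each one.

  import math
  import sys

  def sid_to_bits(sid, alphabet):
      """Map each symbol to its alphabet index and emit fixed-width bits."""
      width = max(1, (len(alphabet) - 1).bit_length())
      bits = []
      for ch in sid:
          val = alphabet.index(ch)
          bits.extend((val >> k) & 1 for k in reversed(range(width)))
      return bits

  def looks_biased(stream, sigmas=3.0):
      """Flag a bit position whose ones-fraction strays too far from 0.5."""
      n = len(stream)
      # Under a fair-coin model, the stddev of the ones-fraction is 1/(2*sqrt(n)).
      return abs(sum(stream) / n - 0.5) > sigmas / (2.0 * math.sqrt(n))

  def analyze(sids):
      alphabet = sorted(set("".join(sids)))            # observed alphabet
      streams = [sid_to_bits(s, alphabet) for s in sids]
      width = min(len(b) for b in streams)
      for pos in range(width):
          # One bitstream per bit position, ordered by time of collection.
          stream = [b[pos] for b in streams]
          if looks_biased(stream):
              print("bit %3d looks biased (%d/%d ones)"
                    % (pos, sum(stream), len(stream)))

  if __name__ == "__main__":
      # Hypothetical usage: pipe in a few thousand captured session IDs,
      # one per line, in the order they were issued.
      analyze([line.strip() for line in sys.stdin if line.strip()])

With a few thousand IDs from a strong generator, no position should trip this check much more often than chance; with something like a plain counter, the slowly-changing high-order bits get flagged almost immediately.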
The application, which is in fairly decent shape (not a wobbly PoC) and has been tested under Linux, FreeBSD, and Cygwin, can be downloaded here:

  http://lcamtuf.coredump.cx/stompy.tgz

Cheers,
/mz

[1] BEA WebLogic test output: http://lcamtuf.coredump.cx/BEA.log

In response to earlier WebScarab analysis, BEA stated some time ago that the beginning of the identifier might be deterministic at the MSB positions: http://dev2dev.bea.com/blog/neilsmithline/archive/2006/03/jsessionid_valu_1.html

...but 'stompy' output seems to indicate quite clearly that all of the data exhibits strong biases, irregularities, and correlation patterns, and as such, the randomness of their "very large random number" is questionable at best.