Actually, I would think this information would only be as good as the person doing the testing, and it may in fact lead to a false timeline. To continue with Mr. Litchfield's example, consider the following:

- The bugs (regardless of number) found in a day could have been blatantly obvious;

- The bugs that took two weeks to find may have been more technically obscure, or Mr. Litchfield may have had other things to do rather than spend all his time looking for bugs;

- From this and previous postings, I will take for granted that Mr. Litchfield is an Oracle expert, although to my knowledge we have never met. That being said, how long would it take a novice (or someone less skilled) to find these same bugs? I think even Mr. Litchfield would agree that there are malicious people out there just as expert regarding Oracle products as he is, maybe even more so;

- Level of effort also has to take into account when the research started versus when the application/patch/upgrade was released. For example, suppose 10gR2 was released on April 1st (I don't actually know the date; I'm just picking one) and Mr. Litchfield was on vacation or travel until April 8th. If it then took him two weeks to find these bugs, the 'bad guys' will have had a week's head start over his research. I understand that more people than Mr. Litchfield are doing this research, but that would need to be factored into the equation.

All this being said, I am not taking the position that this information would not be 'interesting', but I don't think it would "provide a more concrete answer to the question 'how secure is software X?'"

Thank you,
Lee Kelly, CISSP

-----Original Message-----
From: Steven M. Christey [mailto:coley@xxxxxxxxx]
Sent: Wednesday, May 10, 2006 6:29 PM
To: davidl@xxxxxxxxxxxxxxx
Cc: bugtraq@xxxxxxxxxxxxxxxxx
Subject: Re: Oracle - the last word

David Litchfield said:

>When Oracle 10g Release 1 was released you could spend a day looking
>for bugs and find thirty. When 10g Release 2 was released I had to
>spend two weeks looking to find the same number.

This increasing level of effort is likely happening for other major, widely audited software products, too. It would be a very useful data point if researchers could publicly quantify how much time and effort they needed to find the issues. (Note: this is not my idea; it came out of various other discussions.)

Level of effort might provide a more concrete answer to the question "how secure is software X?" Some researchers might not want to publicize this kind of information, but it would be one great way to help us move away from the primitive practice of counting the number of reported vulnerabilities.

(And while I'm talking about quantifying researcher effort, it might be highly illustrative to measure how much time is spent dealing with vendors during disclosure.)

- Steve
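
A minimal sketch of the discovery-rate arithmetic behind the "level of effort" point, in Python. It uses only the figures quoted in the thread (about 30 bugs in one day for 10gR1 versus 30 bugs in roughly two weeks for 10gR2); the function name and the seven-day head-start window are illustrative assumptions taken from Mr. Kelly's April 1st/April 8th example, not data from either poster.

    # Hypothetical illustration of the "level of effort" metric discussed above.
    # Figures come from the thread: ~30 bugs in 1 researcher-day for 10g Release 1,
    # ~30 bugs in ~14 researcher-days for Release 2.

    def discovery_rate(bugs_found: int, researcher_days: float) -> float:
        """Bugs found per researcher-day; a lower rate suggests a harder target."""
        return bugs_found / researcher_days

    r1_rate = discovery_rate(30, 1)    # 10g Release 1
    r2_rate = discovery_rate(30, 14)   # 10g Release 2

    print(f"10gR1 discovery rate: {r1_rate:.1f} bugs/researcher-day")
    print(f"10gR2 discovery rate: {r2_rate:.1f} bugs/researcher-day")

    # Kelly's timing objection: if research starts a week after release, attackers
    # have that window regardless of how fast the researcher then works.
    head_start_days = 7  # hypothetical, per the April 1st release / April 8th start example
    print(f"Unexamined exposure window before research began: {head_start_days} days")

As the rates show (30.0 versus about 2.1 bugs per researcher-day), the metric only captures difficulty for the person doing the testing; it says nothing about researcher skill or about when the clock started, which is the core of the objection above.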