On Wed, 2004-01-21 at 18:41, Eric Rescorla wrote:
> However, our investigation does not support a substantial quality
> improvement--the data does not allow us to exclude the possibility
> that the rate of bug finding in any given piece of software is
> constant over long periods of time. If there is little or no quality
> improvement, then we have no reason to believe that the disclosure of
> bugs reduces the overall cost of intrusions.

While I have argued against using CERT numbers as an indicator of security trends, I would point out that the number of vulnerabilities logged by the organization appears to have fallen this year from the 2002 total (based on three quarters of data). It is far too early to tell what this means, if anything. But assuming that a variety of variables are held constant (such as CERT's threshold for what constitutes a vulnerability, the number of applications audited by researchers and crackers, the amount of new code produced by developers this past year, etc.), this could indicate that software quality has improved.

Granted, it also likely means that fewer vulnerabilities are being found at the current low-to-medium level of security auditing (i.e., the "bar" has been raised and the low-hanging fruit has all been picked). Researchers who look at more obscure flaws (Oulu University's investigation of ASN.1 issues is a good example) still seem to be finding enough of them.

-R

--
| robert lemos | senior staff writer | cnet news.com |
| v: (415) 344-2000 | e: rob.lemos@cnet.com |