[NOT A PATCH] Question on regression by bug fixes

Hi Paul

I couldn't follow the reasoning around the following _artificial_ hunk.

diff --git a/formal/regression.tex b/formal/regression.tex
index 29cb787..9831b9d 100644
--- a/formal/regression.tex
+++ b/formal/regression.tex
@@ -387,6 +387,7 @@ To see this, keep in mind that on average, every six fixes introduces
 a bug.
 Therefore, fixing the 24 bugs, which had a combine mean time to failure
 of about 40,000 years, will introduce three more bugs.
+???
 These three bugs most likely fail more often than once per 13,000 years,
 so the reliability of the software has decreased.

Where did the "once per 13,000 years" come from?
Was 13,000 derived from 40,000/3?
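For what it's worth, here is the arithmetic I tried, under my own assumption (not stated in the text) that independent bugs' failure rates simply add and MTTF is the reciprocal of the combined rate:

```python
# My assumptions, not the book's: failure rates of independent bugs add,
# and (combined) MTTF is the reciprocal of the (combined) failure rate.

old_combined_mttf = 40_000.0                 # years, original 24 bugs combined
old_combined_rate = 1.0 / old_combined_mttf  # failures per year

# One reading of "13,000": simply 40,000 divided by the 3 new bugs.
print(40_000 / 3)  # ~13,333 years

# But if rates add, three new bugs *each* with an MTTF of 120,000 years
# (3 * 40,000) would only just match the old combined failure rate:
new_combined_rate = 3 / (3 * old_combined_mttf)
print(abs(new_combined_rate - old_combined_rate) < 1e-15)  # True
```

So under that model I get a per-bug threshold of 120,000 years, not 13,000, which is why the sentence confuses me.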

But in this argument, the original 24 bugs are fixed and 3 new bugs are introduced.
We have no idea what failure rate the new bugs will have, do we?

What am I missing?

    Thanks, Akira
--
To unsubscribe from this list: send the line "unsubscribe perfbook" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
