On Sat, Dec 30, 2006 at 10:51:12AM +0100, Michael Schwendt wrote:
> On Sat, 30 Dec 2006 00:30:17 +0100, Axel Thimm wrote:
> 
> > On Sat, Dec 30, 2006 at 12:08:16AM +0100, Michael Schwendt wrote:
> > > On Fri, 29 Dec 2006 23:30:09 +0100, Axel Thimm wrote:
> > > 
> > > > I don't think Callum suggests you to reduce to only these items on the
> > > > checklist, it should be considered the basic items to check. After all
> > > > they are called a MUST for a reason, e.g. supposedly *every review*
> > > > has checked the MUST items,
> > > 
> > > What is the purpose of listing them in the review then?
> > 
> > Ensuring that reviewers get in touch with the checklist instead of ...
> 
> Ouch. Deadlock. We're in a loop!

Then please do find the exit, I'm already standing on the outside
looking at you spinning in circles - I'm all dizzy already ;)

> When the reviewer is forced to include a commented mandatory
> incomplete checklist, this would require the reviewer to document
> all additional checks (among them things more important than what's
> in the checklist), too, for completeness.

Why does it have to be all or nothing? So either you just stamp off a
package review with a terse approval notice, or you have to write a
book on it? Try to find a middle ground, which is posting a checklist
plus anything else you want to post. Other people have done this
successfully, check their reviews.

> hereby refuse to do that and will rather stop doing reviews completely.
> I do custom reviews and adapt to what is contained within a package,
       ^^^^^^^^^^^^^^
> and more often than not that has helped in blocking crap.

Thanks for putting effort into allowing good packages to evolve, but
any custom reviews, or reviews controlled by personal packaging
habits, need to be built on top of the base checklist. Otherwise the
reviews will differ in quality too much. Someone else may think that
according to his packaging habits it is enough to build and run the
package and stamp it, in the same way you do. How will one see the
difference in the quality of the review?

> > > APPROVAL => all MUST items must have passed the check
> > 
> > ... using the easy way out.
> 
> No, there is no excuse if the approved package does not pass the checklist
> actually.

And how will one be able to tell? Only by doing a complete review
himself ...

> > > > and listing them in the review with a check after them signals
> > > > that you indeed are following the very basic QA requirements.
> > > 
> > > How do you know whether it's not just a single cut'n'paste job?
> > 
> > I don't, and I know that even less when there's a one-liner "APPROVED"
> > in the bugzilla entry.
> 
> Then it's pointless.

Well, nothing in bugzilla has been entered under the influence of a
truth serum, but we don't declare it pointless because of that. You do
tend to exaggerate a bit in the sense of "all or nothing".

> > > The only interesting point is when after approval it turns out that the
> > > reviewer has NOT checked something and has NOT noticed one or more flaws
> > > that should have been noticed when processing the MUST items.
> > 
> > Better be proactive than finding whom to blame afterwards: Forcing the
> > reviewer to interact with the checklist make it less likely for missed
> > items especially when compared to "wild reviews".
> 
> No, thank you. This is a big turn-off criterion for me. When I say "APPROVED",
> all that matters is whether anybody can point me to something I've missed.

And "quis custodiet ipsos custodes?".
Reviewers aren't gods, they are on the same level as contributors, and
when we ask contributors to invest time in packaging and explaining
packaging decisions, we have to ask reviewers to put some visible
effort into the process, too.

All I know by now from your responses is

o that you never check against changes in the packaging guidelines
o nor against changes in the review process
o that you use "custom reviewing" according to your own "packaging habits"

So it looks quite certain that you will miss MUSTs from the packaging
guidelines or the review list, and guess what: one cannot even check
whether your reviews are complete, because there is no trace of them
other than a one-liner APPROVAL stamp.

Therefore it makes very much sense to maintain a basic checklist for

o ensuring that the review really happened according to the minimal
  Fedora standards as given by the review procedure
o being able to track bad reviews after the fact
o making sure every reviewer has access to a fresh official checklist
  (with change dates attached, as proposed by Jeff)

--
Axel.Thimm at ATrpms.net