When the KVM config file parser generates the list of tests, it generates a
full list of dicts, and each dict maps to a test to be executed. However, due
to the design of our dependency system, we skip running tests that had a
dependency failure. While fair, this also masks the fact that the tests that
were not executed are indeed failures (the test couldn't run because a
dependency failed). So test jobs that had a very serious problem (say, the
kvm build failed, so every other test failed in sequence) will yield fairly
reasonable PASS rates that can fool developers.

So, here's what we are going to do to solve this:

 * When a dependency fails, don't just skip the dependent test when its turn
   comes to execute. Execute it in a way that it will always throw a TestNA
   exception. In order to do that:
 * Introduce an extra parameter 'dependency_failed = yes' on the dependent
   test 'params' dict.
 * Make the test load code fail the test right away with TestNA whenever
   params['dependency_failed'] is 'yes'.

Changes from v2:

 * Move the test failing code from kvm_preprocessing to kvm.py. Conceptually
   that logic belongs to the high level kvm test load code.
 * Fix typo in the TestNAError class name.

Signed-off-by: Lucas Meneghel Rodrigues <lmr@xxxxxxxxxx>
---
 client/tests/kvm/kvm.py               |    5 +++++
 client/tests/kvm/kvm_preprocessing.py |    1 -
 client/tests/kvm/kvm_utils.py         |    6 +++++-
 3 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/client/tests/kvm/kvm.py b/client/tests/kvm/kvm.py
index b88fd51..c22293a 100644
--- a/client/tests/kvm/kvm.py
+++ b/client/tests/kvm/kvm.py
@@ -27,6 +27,11 @@ class kvm(test.test):
         # Convert params to a Params object
         params = kvm_utils.Params(params)
 
+        # If a dependency test prior to this test has failed, let's fail
+        # it right away as TestNA.
+        if params.get("dependency_failed") == 'yes':
+            raise error.TestNAError("Test dependency failed")
+
         # Report the parameters we've received and write them as keyvals
         logging.debug("Test parameters:")
         keys = params.keys()
diff --git a/client/tests/kvm/kvm_preprocessing.py b/client/tests/kvm/kvm_preprocessing.py
index 515e3a5..dbe5d19 100644
--- a/client/tests/kvm/kvm_preprocessing.py
+++ b/client/tests/kvm/kvm_preprocessing.py
@@ -194,7 +194,6 @@ def preprocess(test, params, env):
     @param env: The environment (a dict-like object).
     """
     error.context("preprocessing")
-
     # Start tcpdump if it isn't already running
     if "address_cache" not in env:
         env["address_cache"] = {}
diff --git a/client/tests/kvm/kvm_utils.py b/client/tests/kvm/kvm_utils.py
index 5ecbd4a..ff9ee17 100644
--- a/client/tests/kvm/kvm_utils.py
+++ b/client/tests/kvm/kvm_utils.py
@@ -1173,7 +1173,11 @@ def run_tests(parser, job):
             if not current_status:
                 failed = True
         else:
-            current_status = False
+            # We will force the test to fail as TestNA at test load time
+            dict['dependency_failed'] = 'yes'
+            current_status = job.run_test("kvm", params=dict, tag=test_tag,
+                                          iterations=test_iterations,
+                                          profile_only= bool(profilers) or None)
         status_dict[dict.get("name")] = current_status
 
     return not failed
-- 
1.7.4.2
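
For readers outside the autotest tree, a minimal standalone sketch of the
intended flow follows. TestNAError, run_kvm_test and schedule_tests below are
illustrative stand-ins for error.TestNAError, the kvm test entry point and
kvm_utils.run_tests; they are not the real autotest API, just a self-contained
demo of the 'dependency_failed' mechanism under those assumptions.

    class TestNAError(Exception):
        """Raised when a test cannot run; reported as TestNA, not PASS."""


    def run_kvm_test(params):
        # Mirrors the kvm.py hunk: bail out right away if a dependency
        # of this test has already failed.
        if params.get("dependency_failed") == "yes":
            raise TestNAError("Test dependency failed")
        # Pretend the actual test outcome is driven by the params.
        return params.get("should_pass", "yes") == "yes"


    def schedule_tests(test_dicts):
        # Mirrors the kvm_utils.py hunk: instead of skipping a test whose
        # dependency failed, run it flagged so it raises TestNAError.
        status = {}
        for d in test_dicts:
            if not all(status.get(dep, False) for dep in d.get("dep", [])):
                d["dependency_failed"] = "yes"
            try:
                status[d["name"]] = run_kvm_test(d)
            except TestNAError:
                status[d["name"]] = False
        return status


    if __name__ == "__main__":
        tests = [
            {"name": "build", "dep": [], "should_pass": "no"},  # build fails
            {"name": "boot", "dep": ["build"]},                 # boot -> TestNA
        ]
        print(schedule_tests(tests))  # {'build': False, 'boot': False}

The point of the change is visible in the demo output: 'boot' now shows up as
an executed, not-applicable failure instead of silently disappearing from the
job, which is what keeps the overall PASS rate honest.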