[PATCH 2/4] KVM test: Fail a test right away if 'dependency_failed = yes' is on params

When the KVM config file parser generates the list of tests,
it produces a full list of dicts, each of which maps to a
test to be executed. However, due to the design of our
dependency system, we skip running tests that had a
dependency failure.

While fair, this also masks the fact that the tests that were
not executed are indeed failures (the test couldn't run because a
dependency failed). So test jobs that had a very serious problem
(say, the kvm build failed, so every other test failed in sequence)
will yield fairly reasonable PASS rates, which can fool developers.

So, here's what we are going to do to solve this:

 * When a dependency fails and it comes time to execute a dependent
test, don't just skip it. Execute it in a way that it will always
throw a TestNA exception.

In order to do that:

 * Introduce an extra parameter 'dependency_failed = yes' on the
dependent test's 'params' dict.
 * Make the test preprocessing code fail the test right away with
TestNA whenever params.get('dependency_failed') is 'yes'.
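For reference, the two steps above can be sketched as follows. This is
a minimal standalone sketch, not the actual autotest API: `TestNA`,
`preprocess` and `run_test` here are simplified stand-ins for the real
kvm_preprocessing/kvm_utils code.

```python
class TestNA(Exception):
    """Stand-in for error.TestNA: the test could not run."""


def preprocess(params):
    # Step 2: fail the test right away if a dependency already failed.
    if params.get("dependency_failed") == "yes":
        raise TestNA("Test dependency failed")
    # ... normal preprocessing would continue here ...


def run_test(params):
    # Step 1: the dependent test is still executed (not skipped), so the
    # dependency failure shows up in the results as TEST_NA.
    try:
        preprocess(params)
    except TestNA as e:
        return ("TEST_NA", str(e))
    return ("PASS", None)


print(run_test({"dependency_failed": "yes"}))  # reported as TEST_NA
print(run_test({}))                            # runs normally
```

The point is that the dependent test now produces an explicit result
entry instead of silently disappearing from the job's PASS rate.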

Signed-off-by: Lucas Meneghel Rodrigues <lmr@xxxxxxxxxx>
---
 client/tests/kvm/kvm_preprocessing.py |    6 +++++-
 client/tests/kvm/kvm_utils.py         |    6 +++++-
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/client/tests/kvm/kvm_preprocessing.py b/client/tests/kvm/kvm_preprocessing.py
index 515e3a5..47c29d4 100644
--- a/client/tests/kvm/kvm_preprocessing.py
+++ b/client/tests/kvm/kvm_preprocessing.py
@@ -193,8 +193,12 @@ def preprocess(test, params, env):
     @param params: A dict containing all VM and image parameters.
     @param env: The environment (a dict-like object).
     """
-    error.context("preprocessing")
+    # If a dependency test prior to this test has failed, then let's 'run' this
+    # test, but fail it right away as TestNA.
+    if params.get("dependency_failed") == 'yes':
+        raise error.TestNA("Test dependency failed")
 
+    error.context("preprocessing")
     # Start tcpdump if it isn't already running
     if "address_cache" not in env:
         env["address_cache"] = {}
diff --git a/client/tests/kvm/kvm_utils.py b/client/tests/kvm/kvm_utils.py
index 5ecbd4a..ff9ee17 100644
--- a/client/tests/kvm/kvm_utils.py
+++ b/client/tests/kvm/kvm_utils.py
@@ -1173,7 +1173,11 @@ def run_tests(parser, job):
             if not current_status:
                 failed = True
         else:
-            current_status = False
+            # We will force the test to fail as TestNA during preprocessing
+            dict['dependency_failed'] = 'yes'
+            current_status = job.run_test("kvm", params=dict, tag=test_tag,
+                                          iterations=test_iterations,
+                                          profile_only=bool(profilers) or None)
         status_dict[dict.get("name")] = current_status
 
     return not failed
-- 
1.7.4.2
