On Wed, 19 Jul 2023 at 04:49, Rae Moar <rmoar@xxxxxxxxxx> wrote:
>
> On Tue, Jul 18, 2023 at 3:39 AM David Gow <davidgow@xxxxxxxxxx> wrote:
> >
> > On Sat, 8 Jul 2023 at 05:10, Rae Moar <rmoar@xxxxxxxxxx> wrote:
> > >
> > > Add documentation on the use of test attributes under the section "Tips for
> > > Running KUnit Tests" in the KUnit docs.
> > >
> > > Documentation includes three sections on how to mark tests with attributes,
> > > how attributes are reported, and how the user can filter tests using test
> > > attributes.
> > >
> > > Signed-off-by: Rae Moar <rmoar@xxxxxxxxxx>
> > > ---
> >
> > Looks good overall. Some nitpicks below.
> >
> > Reviewed-by: David Gow <davidgow@xxxxxxxxxx>
> >
> > >
> > > Changes since v1:
> > > - This is a new patch
> > >
> > > .../dev-tools/kunit/running_tips.rst | 163 ++++++++++++++++++
> > > 1 file changed, 163 insertions(+)
> > >
> > > diff --git a/Documentation/dev-tools/kunit/running_tips.rst b/Documentation/dev-tools/kunit/running_tips.rst
> > > index 8e8c493f17d1..c9bc5a6595d3 100644
> > > --- a/Documentation/dev-tools/kunit/running_tips.rst
> > > +++ b/Documentation/dev-tools/kunit/running_tips.rst
> > > @@ -262,3 +262,166 @@ other code executed during boot, e.g.
> > > # Reset coverage counters before running the test.
> > > $ echo 0 > /sys/kernel/debug/gcov/reset
> > > $ modprobe kunit-example-test
> > > +
> > > +
> > > +Test Attributes and Filtering
> > > +=============================
> > > +
> > > +Test suites and cases can be marked with test attributes, such as speed of
> > > +test. These attributes will later be printed in test output and can be used to
> > > +filter test execution.
> > > +
> > > +Marking Test Attributes
> > > +-----------------------
> > > +
> > > +Tests are marked with an attribute by including a ``kunit_attributes`` object
> > > +in the test definition.
> > > +
> > > +Test cases can be marked using the ``KUNIT_CASE_ATTR(test_name, attributes)``
> > > +macro to define the test case instead of ``KUNIT_CASE(test_name)``.
> > > +
> > > +.. code-block:: c
> > > +
> > > + static const struct kunit_attributes example_attr = {
> > > + .speed = KUNIT_VERY_SLOW,
> > > + };
> > > +
> > > + static struct kunit_case example_test_cases[] = {
> > > + KUNIT_CASE_ATTR(example_test, example_attr),
> > > + };
> > > +
> > > +.. note::
> > > + To mark a test case as slow, you can also use ``KUNIT_CASE_SLOW(test_name)``.
> > > + This is a helpful macro as the slow attribute is the most commonly used.
> > > +
> > > +Test suites can be marked with an attribute by setting the "attr" field in the
> > > +suite definition.
> > > +
> > > +.. code-block:: c
> > > +
> > > + static const struct kunit_attributes example_attr = {
> > > + .speed = KUNIT_VERY_SLOW,
> > > + };
> > > +
> > > + static struct kunit_suite example_test_suite = {
> > > + ...,
> > > + .attr = example_attr,
> > > + };
> > > +
> > > +.. note::
> > > + Not all attributes need to be set in a ``kunit_attributes`` object. Unset
> > > + attributes will remain uninitialized and act as though the attribute is set
> > > + to 0 or NULL. Thus, if an attribute is set to 0, it is treated as unset.
> > > + These unset attributes will not be reported and may act as a default value
> > > + for filtering purposes.
> > > +
> > > +Reporting Attributes
> > > +--------------------
> > > +
> > > +When a user runs tests, attributes will be present in kernel output (in KTAP
> > > +format). This is an example of how test attributes for test cases will be formatted
> > > +in Kernel output:
> > > +
> > > +.. code-block:: none
> > > +
> > > + # example_test.speed: slow
> > > + ok 1 example_test
> > > +
> > > +This is an example of how test attributes for test suites will be formatted in
> > > +Kernel output:
> > > +
> > > +.. code-block:: none
> > > +
> > > + KTAP version 2
> > > + # Subtest: example_suite
> > > + # module: kunit_example_test
> > > + 1..3
> > > + ...
> > > + ok 1 example_suite
> >
> > Maybe worth noting that kunit.py will hide these for passing tests by
> > default, and --raw_output is needed to see them?
> >
>
> I will definitely add this in. If attributes are popular in the
> future, I could create a future patch to show attributes in the parser
> output as well.

Yeah, that could definitely be useful as a follow-up patch.

> > > +Additionally, users can output a full attribute report of tests with their
> > > +attributes, using the command line flag ``--list_tests_attr``:
> > > +
> > > +.. code-block:: bash
> > > +
> > > + kunit.py run "example" --list_tests_attr
> > > +
> > > +.. note::
> > > + This report can be accessed when running KUnit manually by passing in the
> > > + module_param ``kunit.action=list_attr``.
> > > +
> > > +Filtering
> > > +---------
> > > +
> > > +Users can filter tests using the ``--filter`` command line flag when running
> > > +tests. As an example:
> > > +
> > > +.. code-block:: bash
> > > +
> > > + kunit.py run --filter speed=slow
> > > +
> > > +
> > > +You can also use the following operations on filters: "<", ">", "<=", ">=",
> > > +"!=", and "=". Example:
> > > +
> > > +.. code-block:: bash
> > > +
> > > + kunit.py run --filter "speed>slow"
> > > +
> > > +This example will run all tests with speeds faster than slow. Note that the
> > > +characters < and > are often interpreted by the shell, so they may need to be
> > > +quoted or escaped, as above.
> > > +
> > > +Additionally, you can use multiple filters at once. Simply separate filters
> > > +using commas. Example:
> > > +
> > > +.. code-block:: bash
> > > +
> > > + kunit.py run --filter "speed>slow, module=kunit_example_test"
> > > +
> > > +.. note::
> > > + You can use this filtering feature when running KUnit manually by passing
> > > + the filter as a module param: ``kunit.filter="speed>slow, speed<=normal"``.
> > > +
> > > +Filtered tests will not run or show up in the test output. You can use the
> > > +``--filter_skip`` flag to skip filtered tests instead. These tests will be
> > > +shown in the test output but will not run. To use this feature when
> > > +running KUnit manually, use the ``kunit.filter`` module param with
> > > +``kunit.filter_action=skip``.
> > > +
> > > +Rules of Filtering Procedure
> > > +----------------------------
> > > +
> > > +Since both suites and test cases can have attributes, there may be conflicts
> > > +between attributes during filtering. The process of filtering follows these
> > > +rules:
> > > +
> > > +- Filtering always operates at a per-test level.
> > > +
> > > +- If a test has an attribute set, then the test's value is filtered on.
> > > +
> > > +- Otherwise, the value falls back to the suite's value.
> > > +
> > > +- If neither are set, the attribute has a global "default" value, which is used.
> > > +
> > > +List of Current Attributes
> > > +--------------------------
> >
> > I wonder whether this should end up part of the KTAP spec (or as an
> > appendix/supplement to it). Or even as a separate page within the
> > KUnit documentation to avoid running_tips.rst from getting too huge.
>
> I am a bit hesitant to move this as part of the KTAP spec in case
> there will exist KTAP attributes/data that are not supported by the
> KUnit test attributes framework (could be runtime specific attributes
> that use a different framework?).

This is probably something best worked out as part of the KTAP spec
process. Either attribute names are a free-for-all (albeit hopefully one
where there are some documented 'common' attributes), or we need some
sort of namespacing between "General KTAP attributes", "KUnit-specific
attributes", "subsystem-specific attributes",
"totally-made-up-on-the-spot attributes", etc.

e.g., email headers have a list of 'standard' ones, but anyone can add
their own as long as it starts with 'X-'. Or OpenGL extensions are
always of the form GL_blah_blah_blah_<vendor> (where vendor is the code
for the company that proposed it, or EXT or ARB for those which have
been agreed upon by everyone).

> However, I do worry about the size of this page. Do you think that I
> should move all of the attributes to a new documentation page?

While I don't think it's a problem with only two attributes, it'd
probably be the more futureproof thing to do. That being said, maybe we
wait until there's a decision on the KTAP side? Up to you.

> >
> > > +
> > > +``speed``
> > > +
> > > +This attribute indicates the speed of a test's execution (how slow or fast the
> > > +test is).
> > > +
> > > +This attribute is saved as an enum with the following categories: "normal",
> > > +"slow", or "very_slow". The assumed default speed for tests is "normal". This
> > > +indicates that the test takes a relatively trivial amount of time (less than
> > > +1 second), regardless of the machine it is running on. Any test slower than
> > > +this could be marked as "slow" or "very_slow".
> >
> > Is it worth noting that "KUNIT_CASE_SLOW()" can be used to easily set
> > this to slow?
>
> This definitely seems important to add. I will add this to the documentation.
>
> >
> >
> > > +
> > > +``module``
> > > +
> > > +This attribute indicates the name of the module associated with the test.
> > > +
> > > +This attribute is automatically saved as a string and is printed for each suite.
> > > +Tests can also be filtered using this attribute.
> > > +
> > > --
> > > 2.41.0.255.g8b1d071c50-goog
> > >
> >
> >
> > Error: new blank line at EOF.
>
> Oops. I will change this.
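
On the namespacing question above, and purely as a made-up illustration
(none of these names are real or standardised anywhere), the reported
lines could end up looking something like:

 # example_test.speed: slow
 # example_test.x-kunit.module: kunit_example_test

i.e. a small set of 'common' attributes keep short names, and anything
tool- or subsystem-specific carries an 'X-'-style prefix saying who
defined it.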
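
And for the KUNIT_CASE_SLOW() note: a minimal sketch of what it could
show, reusing the example_test case from the patch above (the trailing
{} terminator and the comment are my additions, so adjust as needed):

.. code-block:: c

	static struct kunit_case example_test_cases[] = {
		/*
		 * Shorthand for giving example_test the "slow" speed
		 * attribute, instead of spelling out KUNIT_CASE_ATTR()
		 * with a separate kunit_attributes struct.
		 */
		KUNIT_CASE_SLOW(example_test),
		{}
	};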