On Mon, Oct 05, 2020 at 13:41:52 +0200, Ján Tomko wrote:
> On a Monday in 2020, Tim Wiederhake wrote:
> > On Wed, 2020-09-30 at 14:11 +0200, Peter Krempa wrote:
> > > On Wed, Sep 30, 2020 at 13:54:58 +0200, Tim Wiederhake wrote:
> > > > On Tue, 2020-09-29 at 11:22 +0200, Peter Krempa wrote:
> > > > > On Mon, Sep 21, 2020 at 15:07:32 +0200, Tim Wiederhake wrote:
> > > > > > Signed-off-by: Tim Wiederhake <twiederh@xxxxxxxxxx>
> > > > > > ---
> > > > >
> > > > > [...]
> > > > >
> > > > > > diff --git a/tests/cputestdata/x86_64-bogus-element.xml b/tests/cputestdata/x86_64-bogus-element.xml
> > > > > > new file mode 100644
> > > > > > index 0000000000..79f98bad18
> > > > > > --- /dev/null
> > > > > > +++ b/tests/cputestdata/x86_64-bogus-element.xml
> > > > > > @@ -0,0 +1,3 @@
> > > > > > +<cpu>
> > > > > > +  <nonExistantElement/>
> > > > > > +</cpu>
> > > > >
> > > > > I'm not persuaded that there's value in such a test.
> > > >
> > > > This was my approach to adding negative tests to the "virsh
> > > > [hypervisor-]cpu-compare --validate" feature to make sure that
> > > > schema violations are actually caught. If this is not the correct
> > > > place to do so, I would be grateful for a pointer in the right
> > > > direction.
> > >
> > > I'd find it way more useful to actually test that all the XMLs we
> > > use in the tests are validated, which does not seem to happen in
> > > this series.
> > >
> > > This series adds schema validation only for a very synthetic
> > > negative case, which IMO doesn't make sense.
> > >
> > > Additionally we do have 'virschematest', which is meant to validate
> > > all XML documents in the tests. The fact that you've added an
> > > invalid XML example but its name doesn't end in '-invalid.xml',
> > > which is the marker for the 'virschematest' worker to do negative
> > > validation, means that the cpu files are not validated.
> > >
> > > Rather than adding a synthetic case, wire up virschematest, which
> > > will make sure that we don't have bogus XMLs in the test suite.
> >
> > Thanks, I was unaware of this test. One problem I see is that
> > tests/cputestdata contains xml files with different root elements:
> >
> > $ for i in tests/cputestdata/*.xml ; do xpath -q -e "name(/*)" $i ; done | sort | uniq -c
> >     273 cpu
> >     157 cpudata
> >      16 cpuTest
> >
> > Adding 'DO_TEST_DIR("cpu.rng", "cputestdata");' to virschematest.c
> > therefore fails for 157 + 16 files. I do not believe that we would
> > want to individually list 273 files in virschematest.c, and reworking
> > the directory structure appears undesirable.
>
> If by undesirable you mean you don't want to touch it with a 3.048 m
> pole, I understand.
>
> But it's not trivial to figure out from cputest.c or the filenames
> which files belong to which tests, so separating them into different
> directories might help readability.
>
> The 'cpudata' files should have -cpuid- in their name, so separating
> them into a separate directory shouldn't be so difficult.
>
> And <cpuTest> is used as a wrapper for multiple <cpu> nodes for
> baseline tests.

Alternatively, we can add a special schema file for tests which declares
<cpuTest> as 'oneOrMore' <cpu> elements and also describes <cpudata>. Or
both. The special schema would be particularly helpful for validating
the concatenated files used for baselining, as those are not really
validated otherwise if we want to do it via virschematest.
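Roughly, such a test-only grammar could include the existing cpu.rng and
add the <cpuTest> wrapper on top of it. A minimal, untested sketch; the
file name and the 'cpuElement' pattern name are illustrative and assume
cpu.rng exposes a named pattern for a single <cpu> element:

<?xml version="1.0"?>
<grammar xmlns="http://relaxng.org/ns/structure/1.0">
  <!-- reuse everything the existing cpu schema already defines -->
  <include href="cpu.rng"/>

  <!-- additionally accept the <cpuTest> wrapper used by the baseline
       tests: one or more <cpu> elements -->
  <start combine="choice">
    <element name="cpuTest">
      <oneOrMore>
        <ref name="cpuElement"/>
      </oneOrMore>
    </element>
  </start>

  <!-- <cpudata> would need a similar define before the cpuid files
       could be validated the same way -->
</grammar>

virschematest could then be pointed at that grammar for the whole
cputestdata directory instead of plain cpu.rng.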
Since the new version of your patches validates all files and not just
the negative cases, it addresses my complaint just fine. I just didn't
see the point in validating only negative cases without also checking
whether we actually pass on any valid files. Whether it's worth pursuing
the virschematest change is up to you.