Re: automated storage test framework

Hi,

I do not agree with Hans on this; I would not put this kind of stuff in the tests/ dir. Or at least it should be possible to disable the whole set as needed.

The tests described here are more verification using a virtual machine than a unit test system. We might create two separate test suites: one for post-build verification of components and APIs, and one for acceptance testing.

I've been working on a mock-based unit testing framework for some time, as I want to be able to run the majority of tests during development (to see whether I have broken something else). And I expect to put our Brno intern on this task too. Testing classes in isolation, with their bindings faked, will require almost nothing special on our machines.
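
To make this concrete, here is a minimal sketch of the kind of test I mean; every class and function in it is invented for illustration and is not real anaconda code:

    import unittest

    class FakeDevice(object):
        """Hand-rolled fake standing in for a real block device binding."""
        def __init__(self, name, size):
            self.name = name
            self.size = size

    class FakeStorage(object):
        """Fakes only the interface the code under test touches."""
        def __init__(self, devices):
            self.devices = devices

    def pick_largest_disk(storage):
        """Invented function under test: choose the biggest disk."""
        return max(storage.devices, key=lambda d: d.size)

    class PickLargestDiskTest(unittest.TestCase):
        def test_largest_disk_wins(self):
            storage = FakeStorage([FakeDevice("sda", 80),
                                   FakeDevice("sdb", 500)])
            self.assertEqual(pick_largest_disk(storage).name, "sdb")

    if __name__ == "__main__":
        unittest.main()

Nothing here touches real hardware or needs anything installed beyond Python itself, which is exactly the point.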

Acceptance testing is very useful once we have the final thing, but it is a pain when you only need a quick check of your modifications. And those heavyweight tests require a lot of stuff installed and prepared on the testing machine. I already hate the autotools setup we have in anaconda, because we need all dependencies with the proper versions just to create a tarball. Do you know how painful preparing an .srpm for koji on F12 is because of this?

I like the idea of those tests and I think we need them, but do not make it mandatory for the developer to have all the requirements on his devel box, and do not make me run them when I need a fast build-test-improve cycle (typically for stage 1 loader changes) in mock or koji.
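
One possible way to keep such a suite out of the fast path is to gate it behind an environment variable; a minimal sketch follows (the variable name is invented here, and skipTest assumes Python 2.7+ or the unittest2 backport):

    import os
    import unittest

    # Assumption: a single environment variable gates the heavyweight
    # suite, so a quick mock/koji build never triggers it.
    RUN_ACCEPTANCE = os.environ.get("ANACONDA_ACCEPTANCE_TESTS") == "1"

    class StorageAcceptanceTest(unittest.TestCase):
        def setUp(self):
            if not RUN_ACCEPTANCE:
                self.skipTest("set ANACONDA_ACCEPTANCE_TESTS=1 to enable")

        def test_full_vm_install(self):
            pass  # the VM-driven storage checks would go here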

--
Martin Sivák
msivak@xxxxxxxxxx
Red Hat Czech
Anaconda team / Brno, CZ

----- "Hans de Goede" <hdegoede@xxxxxxxxxx> wrote:

> Hi,
> 
> On 05/10/2010 10:53 PM, James Laska wrote:
> > On Mon, 2010-05-10 at 15:05 -0400, Chris Lumens wrote:
> >> http://clumens.fedorapeople.org/anaconda-storage-test
> >>
> >> Over the past few weeks, I've been hard at work on creating an
> >> automated storage test framework for anaconda.  We came up with
> >> this concept sometime around FUDCon Toronto, but it's just finally
> >> come together.
> >>
> >> This framework provides a way to automatically run a kickstart
> >> partitioning snippet and validate that it does what you intended
> >> it to.  I currently have it running against the latest anaconda
> >> package in rawhide, but there's no reason it couldn't instead be
> >> run against the git repo.  We'd just have to do scratch builds
> >> beforehand.
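
A purely illustrative sketch of what one such validation step could look like; the helper below and its crude parsing of parted output are hypothetical, not taken from Chris's actual harness:

    import subprocess

    def check_partition_count(disk, expected):
        """Hypothetical post-run check: ask parted for the partition
        table and compare the partition count with what the test case
        expects."""
        out = subprocess.Popen(["parted", "-s", disk, "print"],
                               stdout=subprocess.PIPE).communicate()[0]
        # parted prints one numbered line per partition; this crude
        # parse is for illustration only
        parts = [l for l in out.splitlines() if l.strip()[:1].isdigit()]
        if len(parts) != expected:
            raise AssertionError("%s: expected %d partitions, found %d"
                                 % (disk, expected, len(parts)))
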
> >>
> >> Running against F13 and earlier is impossible since it requires my
> >> modular anaconda patches.
> >>
> >> I also don't see any reason why this couldn't be even more
> >> automated - instead of requiring you to kick off a run, we could
> >> easily script it to happen every time there's a new anaconda
> >> build, provided we have the spare hardware to do so.  If we decide
> >> to do that, I'll have to make results reporting fancier, as it's
> >> just logging to a local directory for now.
> >>
> >> My current strategy is to go through the partitioning section of
> >> the Fedora test matrix
> >> (http://fedoraproject.org/wiki/Test_Results:Current_Installation_Test)
> >> and convert all those into test cases.  Once we've got that done,
> >> we can start adding test cases to check very specific pieces -
> >> ignoredisk behavior, what happens when you have two existing disks
> >> with conflicting LVM metadata, conditions from a single bug,
> >> iSCSI, whatever.  It's really pretty flexible.
> >>
> >> Currently, this code doesn't live anywhere besides a directory on
> >> my computer.  It's not in a local anaconda git branch.  It's not
> >> in autoqa.  Is there a good place for this stuff to live, or is it
> >> destined to be off on its own?
> >
> > Committing this to AutoQA seems appropriate to me.  I can see this
> > living alongside other installation-related tests in AutoQA.  Not
> > sure if you have commit privs, but we can certainly fix that.
> >
> 
> Or maybe just make it part of the tests dir in anaconda git, see
> below.
> 
> > What frequency do you anticipate having these tests run?  Every new
> > anaconda build?
> >
> 
> I would really like to see these tests run as part of a build;
> there is a reason a spec file can have a %check section: test cases
> failing is a very valid reason to abort a build.
> 
> I'm not sure how feasible this is, though.  I guess that for
> building the livecd a repo (and thus network access) is needed, and
> it will take quite a bit of resources too.  If others agree it is
> desirable to run this at anaconda (package / rpm) build time, we
> could chat with the infrastructure people about this.
> 
> >> What does everyone think?  Are my test cases too picky?  Not
> >> picky enough?  Is there something obviously stupid that I'm doing?
> >> I'd like to get some help plowing through the test matrix before I
> >> open this up to the world at large to play with.
> >
> > This is amazing stuff!  Well done :)
> >
> 
> I have to second that, great work!
> 
> Regards,
> 
> Hans
> 
> 
> p.s.
> 
> Are you aware of the scsi_debug module?  It allows you to create
> fake SCSI disks with very precise parameters.  I guess you can
> probably make them identical enough to make the multipath tools
> think they are a multipath device.  This would allow tests for
> things like partition alignment (faking 4k physical sector disks),
> etc.
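
A minimal sketch of driving scsi_debug from a test, assuming the running kernel's scsi_debug supports the sector_size and physblk_exp parameters (check modinfo scsi_debug first):

    import subprocess

    def load_fake_disk(size_mb=256, physblk_exp=3):
        """Load scsi_debug so the kernel creates a fake disk with
        512-byte logical and (512 << physblk_exp) = 4096-byte physical
        sectors.  Needs root; unload with 'modprobe -r scsi_debug'."""
        subprocess.check_call(["modprobe", "scsi_debug",
                               "dev_size_mb=%d" % size_mb,
                               "sector_size=512",
                               "physblk_exp=%d" % physblk_exp])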

_______________________________________________
Anaconda-devel-list mailing list
Anaconda-devel-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/anaconda-devel-list

