Hi Scott!

The problem is that my test database has several tables with many links
between them, so I have no idea which 1000 rows to get from which table.
The only thing I can do is run the program that connects to that database
and tell it to run on a sample of the database.

I can get a log of all the queries that are executed, but I was wondering
if there is a more general solution where I could use a "modified/hacked"
postgres driver and catch all the rows of all the tables that were accessed
during those queries. I could then simply insert them into the test
database, and in theory my program should run the same against it as
against the real one (assuming it's configured to run on the same sample).

Daniel Shane

>>>QUOTE
I'd create a test schema, set the search path on your test user to just
that schema, and you could create the tables something like so:

CREATE TABLE test.foo AS SELECT * FROM public.foo LIMIT 1000;

Scott
<<<QUOTE

Daniel Shane wrote:
> Hi all!
>
> I have an interesting problem here that I think could be of interest to
> everyone. I'm in the process of writing test cases for our applications,
> and there is one problem I am facing. To be able to test correctly, I
> need to create a small database (a sample, if you want) from a very
> large one so that I can run some tests on a subset of the data.
>
> Sometimes you are asked to do this but know nothing about the database
> in advance (ugh!).
>
> I could create several queries and build it myself by trial and error,
> but I was wondering if a more general approach could be elaborated.
>
> ...
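
A minimal sketch of both ideas in plain SQL (the names mydb, testuser
and foo below are placeholders, not taken from the thread): Postgres can
log every statement the program executes by itself, which avoids a
modified driver, and Scott's sampling approach only needs the test schema
created and the search path pinned to it.

    -- Log every statement the application runs against the database
    -- "mydb" (log_statement is a superuser-only setting; the queries
    -- end up in the server log and can then be replayed against the
    -- full database to extract the rows they touched):
    ALTER DATABASE mydb SET log_statement = 'all';

    -- Schema-based sampling as Scott describes: copy up to 1000 rows
    -- per table into a "test" schema and make the test role see only
    -- that schema by default.
    CREATE SCHEMA test;
    CREATE TABLE test.foo AS SELECT * FROM public.foo LIMIT 1000;
    ALTER ROLE testuser SET search_path = test;

Note that a per-table LIMIT does not by itself keep the links between
tables consistent, so rows sampled this way may reference rows that were
not sampled.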