>
> > I think an easy approach would be to COPY the CSV files into a separate database using psql's \copy command and then pg_dump that as separate insert statements with pg_dump --inserts.
>
> This was my first thought too. However, as I understand it, pg_dump --inserts basically emits an INSERT INTO ... statement for every row.
> In other words, each row is unprepared and executed individually. That is also not a real-life scenario.
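
For reference, the suggestion above would look roughly like this; the table, column, database, and file names are placeholders, not anything from the original post:

    -- in psql, connected to a scratch database, with a table matching the CSV layout
    CREATE TABLE staging_events (id integer, payload text, created_at timestamptz);
    \copy staging_events FROM 'events.csv' WITH (FORMAT csv, HEADER)

    # then from the shell, dump the rows back out as individual INSERT statements
    pg_dump --data-only --inserts --table=staging_events scratchdb > inserts.sql

The resulting file contains one single-row INSERT per CSV line, which is exactly the per-row behavior being objected to above.
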
You really need to describe what you consider to be a "real life scenario", and probably give a better idea of how these CSV files are created and how many of them there are, in addition to describing the relevant behavior of the application you are testing.
If you want maximum realism, you should probably write integration tests for your application and then execute those at high volume.
Or, at a minimum, give an example of the output you would want from this unknown program...
David J.