At work I am creating a standard PostgreSQL benchmark suite based on the queries and operations that we commonly run. A couple of questions:

+ Should I shut down/restart the DB between runs?
+ How much bigger than memory should my tables be for a good benchmark?

One issue to keep in mind is that the benchmark DB will be only a subset of the real DBs, to make it easier to copy to multiple machines. Once we show improvements in the benchmark subset after hardware/configuration/DB redesign, we would validate against the full-sized DBs on the different machines.

The goals are to benchmark different settings and machines so we can improve performance both by changing the DB structures (i.e. index changes, DB redesign) and by buying/upgrading hardware.
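For the restart-between-runs question, here is a minimal sketch of what one cold-cache iteration might look like. Assumptions (all hypothetical, not from the original post): Linux with a systemd-managed PostgreSQL service, a benchmark database named `benchdb`, and a custom query file `run_queries.sql`. Restarting PostgreSQL empties `shared_buffers`, but table pages can still sit in the OS page cache, so that is dropped as well. By default the script only prints the commands it would run; set `DRY_RUN=0` to actually execute them.

```shell
#!/bin/sh
# One cold-cache benchmark iteration (sketch, hypothetical names).
# DRY_RUN=1 (the default) prints each command instead of running it.
: "${DRY_RUN:=1}"
PLAN=""

run() {
    if [ "$DRY_RUN" = "1" ]; then
        PLAN="${PLAN}+ $*
"
        echo "+ $*"
    else
        "$@"
    fi
}

# Restart PostgreSQL to clear shared_buffers.
run sudo systemctl restart postgresql

# Flush dirty pages, then drop the OS page cache so table data
# is read from disk, not from kernel memory.
run sync
run sudo sh -c 'echo 3 > /proc/sys/vm/drop_caches'

# Run the workload once; pgbench -n skips vacuuming, -f points at
# the custom script, -t 1 runs it a single time and reports latency.
run pgbench -n -f run_queries.sql -t 1 benchdb
```

Whether you want cold-cache runs at all depends on what you are measuring: restarting and dropping caches models worst-case I/O-bound behavior, while repeated warm runs model steady-state production load; many suites record both.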