I'm developing a search engine on top of a PostgreSQL database. I've already done some tuning to improve performance. Now I'd like to run a realistic performance test with some number X of concurrent queries, to see how it behaves under load. What is the correct way to do this?
I guess the only way to know how it will perform with your own application is to benchmark it with queries coming from your own application. Create a test suite of typical queries and use your favourite scripting language to spawn a number of threads and hammer the database.

I also find it interesting to measure the responsiveness of the server while torturing it, simply by timing how long it takes to answer a trivial query and graphing that over the course of the run.

Don't have the N threads all issue the exact same queries, though, because then you will only hit a tiny, fully cached slice of your dataset. Introduce some randomness into the test, for instance by picking search terms from a list. Finally, benchmark from another machine so that the test client's CPU usage doesn't become part of the problem.
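Here's a minimal sketch of that idea in Python, assuming psycopg2, a hypothetical `documents` table with a `tsvector` column called `body`, and a made-up connection string and list of search terms; swap in your own schema and real queries from your application:

```python
import random
import threading
import time

import psycopg2

DSN = "dbname=searchdb user=bench host=dbserver"  # assumed connection string
SEARCH_TERMS = ["postgres", "index", "tuning", "vacuum", "planner"]  # sample terms
N_THREADS = 10
QUERIES_PER_THREAD = 100

latencies = []
lock = threading.Lock()
done = threading.Event()

def worker():
    """Hammer the database with randomized full-text searches."""
    conn = psycopg2.connect(DSN)
    cur = conn.cursor()
    for _ in range(QUERIES_PER_THREAD):
        term = random.choice(SEARCH_TERMS)  # randomness so threads don't all hit the same rows
        start = time.perf_counter()
        cur.execute(
            "SELECT id FROM documents WHERE body @@ plainto_tsquery(%s) LIMIT 50",
            (term,),
        )
        cur.fetchall()
        with lock:
            latencies.append(time.perf_counter() - start)
    cur.close()
    conn.close()

def probe():
    """Periodically time a trivial query to track server responsiveness under load."""
    conn = psycopg2.connect(DSN)
    cur = conn.cursor()
    while not done.is_set():
        start = time.perf_counter()
        cur.execute("SELECT 1")
        cur.fetchone()
        print(f"probe latency: {time.perf_counter() - start:.4f}s")
        time.sleep(1)
    conn.close()

threads = [threading.Thread(target=worker) for _ in range(N_THREADS)]
prober = threading.Thread(target=probe)
prober.start()
for t in threads:
    t.start()
for t in threads:
    t.join()
done.set()
prober.join()

latencies.sort()
print(f"{len(latencies)} queries, "
      f"median {latencies[len(latencies) // 2]:.4f}s, "
      f"p95 {latencies[int(len(latencies) * 0.95)]:.4f}s")
```

Run it from a separate machine, vary N_THREADS to find where throughput flattens out, and graph the probe latencies to see how responsiveness degrades as load increases.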