In order to test a real-life scenario (and use it for benchmarking), I want to load a large amount of data from CSV files.
The requirement is that the load should happen like an application writing to the database (that is, no COPY command).
Is there a tool which can do the job? Basically, parse the CSV file and insert it into the database row by row.
I'm skeptical that ingesting CSV of any form, even if you intentionally blow things up by converting it into:
BEGIN;
INSERT INTO tbl VALUES ('','','');
COMMIT;
BEGIN;
INSERT INTO tbl VALUES ('','','');
COMMIT;
(which is what auto-commit mode looks like) is going to provide a meaningful benchmark for application-like usage patterns.
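If you do want to reproduce that per-row, per-transaction pattern from a script, a minimal sketch in Python might look like the following. It assumes psycopg2 is installed and uses a hypothetical three-column table named tbl, a file data.csv, and placeholder connection details; adjust all of those for your setup.

import csv
import psycopg2

# Connection string is a placeholder; adjust for your environment.
conn = psycopg2.connect("dbname=test user=postgres")
conn.autocommit = True  # one transaction per INSERT, matching the pattern above

with conn.cursor() as cur, open("data.csv", newline="") as f:
    for row in csv.reader(f):
        # Parameterized insert into a hypothetical three-column table.
        cur.execute("INSERT INTO tbl VALUES (%s, %s, %s)", row)

conn.close()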
But anyway, I'm not familiar with any tools that make this particularly simple. In most situations like this, I'll just import the CSV into a spreadsheet and create a formula that builds out the individual SQL commands. Whether that's useful depends a lot on how often the source CSV is updated.
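If a spreadsheet feels like overkill, the same idea (building out the individual INSERT statements as text) can be done with a short Python script instead of a formula; the file and table names here are again hypothetical:

import csv

with open("data.csv", newline="") as src, open("load.sql", "w") as out:
    for row in csv.reader(src):
        # Naive quoting: double any embedded single quotes. Good enough
        # for a one-off benchmark script, not for untrusted input.
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        out.write("INSERT INTO tbl VALUES (" + vals + ");\n")

Feeding the resulting load.sql to psql, which auto-commits each statement by default, gives you exactly the one-transaction-per-INSERT behavior shown above.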
That said, I have found the following tool to be generally helpful in this area, though I'm thinking it doesn't do what you want here.
David J.