Right now I have about 7,000 tables, one per stock, and I use
Perl to do the inserts; it's very slow. I'd like to use COPY or another
bulk-loading tool to load the daily raw gzipped data, but I would need to
split the file into per-stock files first before bulk loading. I consider
this a bit messy.
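For what it's worth, the pre-split step can be done in one pass over the gzipped file. Here's a minimal sketch in Python (the same idea works in Perl); it assumes the symbol is the first comma-separated field, and the file/column layout is hypothetical:

```python
# Hypothetical sketch: split a daily gzipped CSV into per-symbol files
# so each one can be bulk-loaded with COPY. Assumes the first field of
# every row is the stock symbol; adjust for the real file layout.
import csv
import gzip
from collections import defaultdict

def split_by_symbol(path):
    # Buffer rows per symbol, then write one file per symbol.
    buckets = defaultdict(list)
    with gzip.open(path, "rt", newline="") as f:
        for row in csv.reader(f):
            buckets[row[0]].append(row)
    for symbol, rows in buckets.items():
        with open(f"{symbol}.csv", "w", newline="") as out:
            csv.writer(out).writerows(rows)
```

Each resulting file can then go straight into `COPY tablename FROM 'SYMBOL.csv' WITH CSV`.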
Are you committing each insert separately or doing them in batches using 'begin transaction' and 'commit'?
I have a database that I do inserts into from a text file. Doing a commit every 1000 inserts cut the time by over 90%.
--
Mike Nolan