Re: huge price database question..

Right now I have about 7,000 tables, one per stock, and I use Perl to do the
inserts; it's very slow. I would like to use COPY or another bulk-loading tool
to load the daily raw gz data, but then I would need to split the file into
per-stock files before loading. I consider this a bit messy.
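For reference, COPY can be fed straight from the gzipped file without writing
per-stock files to disk, if the rows carry a symbol column and go into one
staging table. A rough, untested sketch with DBD::Pg (table, column, and file
names here are made up):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical database and schema; adjust names to the real layout.
my $dbh = DBI->connect('dbi:Pg:dbname=prices', 'user', '',
                       { RaiseError => 1, AutoCommit => 0 });

# Stream the gzipped daily file through gzip -dc instead of
# splitting it into per-stock files first.
open my $fh, '-|', 'gzip', '-dc', 'daily.txt.gz' or die $!;

$dbh->do('COPY daily_prices (symbol, trade_date, open, high, low, close, volume) FROM STDIN');
while (my $line = <$fh>) {
    $dbh->pg_putcopydata($line);   # one tab-separated row per line
}
$dbh->pg_putcopyend;
$dbh->commit;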

Are you committing each insert separately or doing them in batches using 'begin transaction' and 'commit'?

I have a database that I load with inserts from a text file. Committing every 1,000 inserts instead of after each one cut the load time by over 90%.
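Something like this (untested, with a made-up schema) shows the batching
pattern in Perl DBI: turn AutoCommit off, reuse a prepared statement, and
commit every 1,000 rows rather than per insert:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection and table; the batch size of 1,000 is illustrative.
my $dbh = DBI->connect('dbi:Pg:dbname=prices', 'user', '',
                       { RaiseError => 1, AutoCommit => 0 });

my $sth = $dbh->prepare(
    'INSERT INTO daily_prices (symbol, trade_date, close) VALUES (?, ?, ?)');

my $count = 0;
while (my $line = <STDIN>) {
    chomp $line;
    my ($symbol, $date, $close) = split /\t/, $line;
    $sth->execute($symbol, $date, $close);
    $dbh->commit if ++$count % 1000 == 0;   # commit in batches of 1,000
}
$dbh->commit;   # flush the final partial batch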
--
Mike Nolan
