
Any solution for doing a data file import spawned across multiple processes?

hi there,

I am trying to import large data files into pg. 
For now I use the xargs Linux command to feed the file line by line to parallel processes, using the maximum number of available connections.
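Roughly, it is the kind of pipeline below, except that we feed single lines rather than chunks (data.csv, mytable, and mydb are placeholder names, not our real setup):

    # split the input into fixed-size chunks, then run up to 8 parallel
    # psql processes, each bulk-loading one chunk with \copy
    split -l 100000 data.csv chunk_
    ls chunk_* | xargs -P 8 -I{} \
        psql -d mydb -c "\copy mytable FROM '{}' WITH (FORMAT csv)"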

We use pgpool as the connection pool in front of the database, and so try to maximize the concurrent import of the file.

The problem: it seems to work well, but we miss a line once in a while, and that is not acceptable. It also creates zombie processes ;(.

Does anybody have any other tricks that will do the job?

thanks,

Henk
-- 
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general


