Re: importing large files

Hi,

On Friday 28 September 2007 10:22:49, olivier.scalbert@xxxxxxxxxxx wrote:
> I need to import between 100 million and one billion records into a
> table. Each record is composed of two char(16) fields. The input format
> is a huge CSV file. I am running on a Linux box with 4 GB of RAM.
> First I create the table. Second, I 'copy from' the CSV file. Third, I
> create the index on the first field.
> The overall process takes several hours. The CPU seems to be the
> limitation, not the memory or the I/O.
> Are there any tips to improve the speed?

If you don't need to fire any triggers and you trust the input data, then you 
may benefit from the pg_bulkload project:
  http://pgbulkload.projects.postgresql.org/

The "conditions of usage" may be lighter than what I think they are, though.

Regards,
-- 
dim

