Re: optimizing import of large CSV file into partitioned table?

2010/3/28 Thom Brown <thombrown@xxxxxxxxx>:

> The problem here is that you appear to require an index update, trigger
> firing and constraint check for every single row.  The first thing I'd
> suggest is removing the indexes and re-creating them after your import;
> otherwise the index has to be updated for every single entry.
+1
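
Something like this, as a rough sketch (all names made up here: a table
"measurements" with an index on "logdate", input in /tmp/data.csv):

  # drop the index, bulk-load with COPY, rebuild the index once at the end
  psql -c "DROP INDEX measurements_logdate_idx;"
  psql -c "\copy measurements FROM '/tmp/data.csv' WITH CSV"
  psql -c "CREATE INDEX measurements_logdate_idx ON measurements (logdate);"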

> And the trigger won't help
> either.  Import into a single table and split it out into further tables
> afterwards, if required.
Note: partitioning itself could only help here if there were multiple
physical volumes/spindles behind the data directory.
To maximize performance, I would rather split the CSV input (with
awk/perl/whatever) before loading, so that each partition gets its own
loading backend.
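
A rough sketch of that idea, assuming (hypothetically) that column 1 of
the CSV holds the partition key and the child tables are named
measurements_<key>:

  # split the input on the key in column 1, one chunk file per partition
  # (awk keeps one file handle open per key - fine for a few partitions)
  awk -F, '{ print > ("chunk_" $1 ".csv") }' big.csv
  # load each chunk through its own backend, in parallel
  for f in chunk_*.csv; do
      key=${f#chunk_}; key=${key%.csv}
      psql -c "\copy measurements_$key FROM '$f' WITH CSV" &
  done
  wait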

> And finally, the constraint should probably be applied
> afterwards too, so cull any violating rows after importing.
+1
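
For example (made-up table/column again; the DELETE has to come first,
because ADD CONSTRAINT validates all existing rows):

  # cull rows that would violate the check, then attach the constraint
  psql -c "DELETE FROM measurements WHERE price < 0;"
  psql -c "ALTER TABLE measurements
             ADD CONSTRAINT measurements_price_check CHECK (price >= 0);"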



-- 
Filip Rembiałkowski
JID,mailto:filip.rembialkowski@xxxxxxxxx
http://filip.rembialkowski.net/


