Re: batch insert/update

On Wed, 26 Dec 2007 20:48:27 +0100
Andreas Kretschmer <akretschmer@xxxxxxxxxxxxx> wrote:

> blackwater dev <blackwaterdev@xxxxxxxxx> wrote:
> 
> > I have some PHP code that will be pulling in a file via FTP.
> > This file will contain 20,000+ records that I then need to pump
> > into the Postgres db.  These records will represent a subset of
> > the records in a certain table.  I basically need an efficient
> > way to pump these rows into the table, replacing matching rows
> > (based on id) already there and inserting ones that aren't.
> > Essentially I'd be looping through the result and inserting or
> > updating based on the presence of each row.  What is the best
> > way to handle this?  This is something that will run nightly.

> Insert your data into an extra table and use regular SQL to
> insert/update the destination table. You can use COPY to load the
> data into that extra table; this is very fast, but you need a
> suitable file format for it.
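
Something like this, I suppose (the table items(id, name, price), the
staging table and the file path below are only placeholders for the
example):

  -- staging table with the same shape as the destination
  CREATE TEMP TABLE items_staging (LIKE items);

  -- bulk-load the nightly file (from PHP you could use pg_copy_from()
  -- or psql's \copy instead of a server-side path)
  COPY items_staging FROM '/path/to/nightly_file.csv' WITH CSV;

  BEGIN;
  -- update the rows that already exist in the destination
  UPDATE items
     SET name  = s.name,
         price = s.price
    FROM items_staging s
   WHERE items.id = s.id;

  -- insert the rows that are not there yet
  INSERT INTO items (id, name, price)
  SELECT s.id, s.name, s.price
    FROM items_staging s
   WHERE NOT EXISTS (SELECT 1 FROM items WHERE items.id = s.id);
  COMMIT;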

What if you know in advance which rows should be inserted and you
have a batch of rows that should be updated?

Is it still fastest to load them all into a temp table with COPY?

What about the rows that have to be updated, if you have all the
columns and not just the changed ones?
Is it faster to delete & insert, or to update?

The updates come with the same pk as the destination table.
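
For instance (same placeholder names as the sketch above), the
delete & insert variant of that step would be roughly:

  BEGIN;
  -- throw away the current versions of the rows that arrived in the batch
  DELETE FROM items
   USING items_staging s
   WHERE items.id = s.id;

  -- and re-insert all of them from the staging table
  INSERT INTO items (id, name, price)
  SELECT id, name, price
    FROM items_staging;
  COMMIT;

versus a single UPDATE ... FROM items_staging as above. I'd expect the
answer to depend on row width, indexes and the resulting dead tuples,
so it probably needs measuring on real data.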

thx

-- 
Ivan Sergio Borgonovo
http://www.webthatworks.it

