Re: update 600000 rows

Loïc Marteau <okparanoid@xxxxxxx> wrote ..
> Steve Crawford wrote:
> > If this
> > is correct, I'd first investigate simply loading the csv data into a
> > temporary table, creating appropriate indexes, and running a single 
> > query to update your other table.

My experience is that this is MUCH faster. My predecessor in my current position was doing an update from a csv file line by line with perl. That is one reason he is my predecessor. Performance did not justify continuing his contract.
 
> I can try this. The problem is that I have to do an insert if the
> update doesn't affect any rows (the rows don't exist yet). The number
> of rows affected by the insert is minor compared to the number of
> updated rows (and was 0 when I tested my script). I could do it with a
> temporary table: update all the possible rows, then insert the rows
> that are in the temporary table and not in the production table with a
> 'not in' statement. Is this a correct way?
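That two-step approach can be sketched as follows. This is a minimal sketch with hypothetical table and column names (a production table `prod(id, val)` and a staging table loaded from the CSV), not your actual schema:

```sql
-- Load the CSV into a temporary staging table.
CREATE TEMP TABLE staging (id integer PRIMARY KEY, val text);
COPY staging FROM '/path/to/data.csv' WITH CSV;

-- One set-based UPDATE for the rows that already exist...
UPDATE prod
SET    val = staging.val
FROM   staging
WHERE  prod.id = staging.id;

-- ...then insert only the rows that do not exist yet.
INSERT INTO prod (id, val)
SELECT s.id, s.val
FROM   staging s
WHERE  NOT EXISTS (SELECT 1 FROM prod p WHERE p.id = s.id);
```

Note that `NOT EXISTS` usually plans better than `NOT IN` on large tables (and behaves sanely if the subquery can return NULLs), so it is the safer spelling of the same idea.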

That's what I did at first, but later I found better performance with a TRIGGER on the permanent table that deletes the existing row, if any, before the incoming row replaces it. That's effectively what PG does internally on an UPDATE anyway (the old row version is marked dead and a new one is written), and now I can do the entire load in one command.
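A sketch of that delete-before-insert trick, again with hypothetical names (`prod(id, val)` keyed on `id`, staging table `staging`), assuming the trigger fires BEFORE INSERT so a single INSERT ... SELECT acts as an upsert:

```sql
-- Trigger function: remove any existing row with the same key,
-- then let the incoming row through.
CREATE OR REPLACE FUNCTION prod_delete_dup() RETURNS trigger AS $$
BEGIN
    DELETE FROM prod WHERE id = NEW.id;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER prod_upsert
    BEFORE INSERT ON prod
    FOR EACH ROW EXECUTE PROCEDURE prod_delete_dup();

-- Now the whole load is one command:
INSERT INTO prod SELECT * FROM staging;
```

On PostgreSQL 9.5 and later, `INSERT ... ON CONFLICT (id) DO UPDATE` does this natively and should be preferred over the trigger workaround.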

---------------------------(end of broadcast)---------------------------
TIP 2: Don't 'kill -9' the postmaster
