Will give that a try, thanks.
I was actually interested in whether the 2nd approach is common practice, or if
there are some reasons not to do it that way.
Alex
Sean Davis wrote:
On 11/9/05 9:45 AM, "Alex" <alex@xxxxxxxxxxxxxxx> wrote:
Hi,
I have just a general question...
I have a table of 10M records, with a unique key on 5 fields.
I need to update/insert 200k records in one go.
I could do a select to check for existence and then either insert or update.
Or I could simply insert, check the error code, and update if required.
The 2nd seems to be the logical choice, but will it actually be faster,
and moreover, is that the right way to do it?
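For what it's worth, the 2nd approach is usually wrapped in a plpgsql function so the duplicate-key error can be trapped per row instead of aborting the transaction. A rough sketch, assuming a hypothetical table mytable with key columns k1..k5 and a payload column val:

```sql
-- Sketch of "insert, catch the error, update" in plpgsql.
-- Table and column names here are hypothetical.
CREATE FUNCTION upsert_row(a int, b int, c int, d int, e int, v text)
RETURNS void AS $$
BEGIN
    -- Try the insert first; most rows are expected to be new.
    INSERT INTO mytable (k1, k2, k3, k4, k5, val)
    VALUES (a, b, c, d, e, v);
EXCEPTION WHEN unique_violation THEN
    -- The key already exists, so update the existing row instead.
    UPDATE mytable SET val = v
    WHERE k1 = a AND k2 = b AND k3 = c AND k4 = d AND k5 = e;
END;
$$ LANGUAGE plpgsql;
```

Note that the EXCEPTION block forces a subtransaction per call, which has some overhead at 200k rows.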
Probably the fastest and most robust way to go about this if you have the
records in the form of a tab-delimited file is to COPY or \copy (in psql)
them into a separate loader table and then use SQL to manipulate the records
(check for duplicates, etc) for final insertion into the table.
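The loader-table idea can be sketched roughly as follows; table and column names (mytable, loader, k1..k5, val) are hypothetical stand-ins for your actual schema:

```sql
-- Staging table with the same layout as the target.
CREATE TABLE loader (LIKE mytable);

-- In psql, bulk-load the tab-delimited file:
--   \copy loader FROM 'new_records.txt'

-- 1. Update rows whose key already exists in the target:
UPDATE mytable m SET val = l.val
FROM loader l
WHERE m.k1 = l.k1 AND m.k2 = l.k2 AND m.k3 = l.k3
  AND m.k4 = l.k4 AND m.k5 = l.k5;

-- 2. Insert the rows that are genuinely new:
INSERT INTO mytable
SELECT l.*
FROM loader l
WHERE NOT EXISTS (
    SELECT 1 FROM mytable m
    WHERE m.k1 = l.k1 AND m.k2 = l.k2 AND m.k3 = l.k3
      AND m.k4 = l.k4 AND m.k5 = l.k5);
```

Two set-based statements like these generally beat 200k individual insert/update round trips.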
Sean
---------------------------(end of broadcast)---------------------------
TIP 6: explain analyze is your friend