
How to skip duplicate records while copying from CSV to table in Postgresql using "COPY"


 



Hi,

I am copying data from a CSV file into a table using the "COPY" command. The thing I am stuck on is how to skip duplicate records during the copy. From the documentation, it seems PostgreSQL doesn't have any built-in way to handle this with "COPY". Searching around, I found the idea below of using a temp table:

http://stackoverflow.com/questions/13947327/to-ignore-duplicate-keys-during-copy-from-in-postgresql
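Something like this is what I have in mind for the temp-table approach (just a rough sketch; "items" and its primary key "id" are placeholder names, my real table and CSV path are different):

    -- Stage the CSV into a temp table that mirrors the target table.
    CREATE TEMP TABLE items_staging (LIKE items INCLUDING DEFAULTS);

    COPY items_staging FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);

    -- Move over only the rows whose key is not already in the target,
    -- picking one row per id in case the CSV itself contains duplicates.
    INSERT INTO items
    SELECT DISTINCT ON (id) *
    FROM items_staging s
    WHERE NOT EXISTS (SELECT 1 FROM items t WHERE t.id = s.id);

    DROP TABLE items_staging;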

I am also thinking: what if I let all the records get inserted and then delete the duplicate records from the table, as this post suggests - http://www.postgresql.org/message-id/37013500.DFF0A64A@xxxxxxxxxxxxxxxxxxxx.
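If I go that second way, I suppose it would look roughly like this (again with the placeholder "items"/"id" names; I guess this only works if the unique constraint is not yet on the table, otherwise the COPY itself would fail on the duplicates):

    -- Load everything, duplicates included.
    COPY items FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);

    -- Then delete the duplicate keys, keeping one row per id
    -- (the one with the lowest physical row id, ctid).
    DELETE FROM items a
    USING items b
    WHERE a.id = b.id
      AND a.ctid > b.ctid;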

Both solutions look like they do double work, and I am not sure which is the better option here. Can anybody suggest which approach I should adopt? Or if you have any better ideas for this task, please share.

Thanks in advance!

-- 
================
Regards,
Arup Rakshit
================
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

--Brian Kernighan





