
Re: COPY from .csv File and Remove Duplicates


 



On Thu, 11 Aug 2011, David Johnston wrote:

> If you have duplicates with matching real keys, inserting into a staging
> table and then moving new records to the final table is your best option.
> (In general it is better to do a two-step with a staging table, since you
> can readily use PostgreSQL to perform any intermediate translations.) As
> for the import itself,

David,

  I presume what you call a staging table is what I refer to as a copy of
the main table, but with no key attribute.

  Writing the SELECT statement to delete from the staging table those rows
that already exist in the main table is where I'm open to suggestions.

> In this case I would just import the data to a staging table without any
> kind of artificial key, just the true key,

  There is no true key, only an artificial key so I can ensure that rows are
unique. That's in the main table with the 50K rows. No key column in the
.csv file.
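
  For the archives, here is a sketch of the kind of statement being asked
about. All table and column names (main_table, staging, col_a, col_b, the
.csv path) are hypothetical placeholders, not from the thread; since the
main table's key is a surrogate, the comparison has to be on the data
columns themselves:

```sql
-- Hypothetical schema: main_table(id serial, col_a text, col_b text)
-- and a key-less staging table with the same data columns.

-- 1. Load the raw .csv into the staging table:
COPY staging (col_a, col_b) FROM '/path/to/data.csv' WITH (FORMAT csv);

-- 2. Delete staging rows whose data columns already exist in main_table:
DELETE FROM staging s
WHERE EXISTS (
    SELECT 1
    FROM main_table m
    WHERE m.col_a = s.col_a
      AND m.col_b = s.col_b
);

-- 3. Move the remaining (new) rows into the main table; the serial
--    key column fills itself in:
INSERT INTO main_table (col_a, col_b)
SELECT col_a, col_b
FROM staging;
```

One caveat: plain `=` never matches NULLs, so if the .csv columns can be
NULL, `IS NOT DISTINCT FROM` is the safer comparison in step 2.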

Thanks,

Rich



--
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general

