Re: Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table.

On Mon, Mar 8, 2010 at 5:49 AM, Scott Marlowe <scott.marlowe@xxxxxxxxx> wrote:
> On Sun, Mar 7, 2010 at 1:45 AM, Allan Kamau <kamauallan@xxxxxxxxx> wrote:
>> Hi,
>> I am looking for an efficient and effective solution to eliminate
>> duplicates in a continuously updated "cumulative" transaction table
>> (no deletions are envisioned as all non-redundant records are
>> important). Below is my situation.
>
> Is there a reason you can't use a unique index and detect failed
> inserts and reject them?
>

I think it would have been possible to make use of a unique index as you
suggested, and to silently trap the uniqueness violation.
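
For illustration, here is a minimal sketch of what I understand you to
mean (the table, column, and function names are invented, since my
actual schema is not shown here):

-- Hypothetical cumulative table; the real schema differs.
CREATE TABLE transactions (
    tx_key  text PRIMARY KEY,  -- natural key that must stay unique
    payload text
);

-- Insert one row, silently swallowing a duplicate.
CREATE OR REPLACE FUNCTION insert_tx(p_key text, p_payload text)
RETURNS boolean AS $$
BEGIN
    INSERT INTO transactions (tx_key, payload)
    VALUES (p_key, p_payload);
    RETURN true;   -- the row was new
EXCEPTION WHEN unique_violation THEN
    RETURN false;  -- duplicate: trapped and rejected
END;
$$ LANGUAGE plpgsql;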

But in my case (as pointed out in my previous lengthy mail) I am
inserting multiple records at once, which implicitly means a single
transaction. I think in this scenario a violation of uniqueness by
even a single record will lead to all the other records (in this
batch) being rejected as well.
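
Concretely, with the hypothetical table above, a single multi-row
INSERT is all-or-nothing:

-- If 'k2' already exists, the whole statement fails and
-- 'k1' is not inserted either.
INSERT INTO transactions (tx_key, payload)
VALUES ('k1', 'new row'),
       ('k2', 'duplicate row');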

Is there perhaps a way to single out only the record(s) that violate
the unique constraint, without having to perform individual record
inserts? I am following the example found here:
http://www.postgresql.org/docs/8.4/interactive/plpgsql-control-structures.html#PLPGSQL-ERROR-TRAPPING

Allan.

