Errors in transactions

Dear PostgreSQL experts,

I thought I understood transactions until I found this problem.  I'm
sure it must be a common situation.  Can someone point me in the right
direction?

I have a simple table that records which flags are set for various
objects:

create table obj_flags (
	obj_id   integer,
	flag     text,
	primary key (obj_id, flag)
);

In my application, I have a function that wants to set flag F for a set
of objects.  A first attempt, in pseudo-code, looks like this:

set_flag ( set<obj> s ) {
  SQL "begin;"
  foreach obj in s {
    SQL "insert into obj_flags values (" obj.id ", 'F');"
  }
  SQL "end;"
}

But the flag may already be set for some of the objects.  This is fine
as far as the application is concerned, but those INSERTs fail with
duplicate key errors, and any error aborts the entire transaction, so
nothing gets committed.
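
For concreteness, here is roughly what this looks like in a psql session
(a sketch; the exact error wording varies between versions):

begin;
insert into obj_flags values (1, 'F');
insert into obj_flags values (1, 'F');   -- flag already set for object 1
ERROR:  duplicate key violates unique constraint "obj_flags_pkey"
insert into obj_flags values (2, 'F');
ERROR:  current transaction is aborted, commands ignored until end of transaction block
end;   -- nothing is committed; the whole transaction is rolled back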

I thought this would be simple to fix, but I'm almost stuck.  Things
I've tried:

- I looked for some sort of "INSERT OR IGNORE" command that will
silently do nothing if the row already exists.

- I looked for a way of detecting and then clearing the error condition,
but there isn't one.

- I looked for a way of changing the transaction semantics so that it
would commit the results of the commands that did succeed unless I
explicitly told it to ROLLBACK, but this seems impossible.

- I considered starting a new transaction after each error:

set_flag ( set<obj> s ) {
start:
    SQL "BEGIN;"
    foreach obj in s {
      SQL "insert into obj_flags values (" obj.id ", 'F');"
      if previous command failed {
        SQL "ROLLBACK;"
        s.remove obj
        goto start
      }
    }
    SQL "end;"
}

but it looks like it will increase the computational complexity from
O(sizeof(s)) to O(sizeof(s)^2) in the worst case, since every failed
insert rolls back and re-runs all of the inserts that had already
succeeded.  That is not great.

- I considered doing an explicit test for existence before each insert:

set_flag ( set<obj> s ) {
  SQL "begin;"
  foreach obj in s {
    SQL "select * from obj_flags where obj_id = " obj.id " and flag='F'"
    if tuples returned = 0 {
      SQL "insert into obj_flags values (" obj.id ", 'F');"
    }
  }
  SQL "end;"
}

but this requires TRANSACTION ISOLATION LEVEL SERIALIZABLE in case
another connection adds the same flag between my SELECT and INSERT.  (A
single-statement form of this check is sketched after this list.)

- At the moment I can't see a difficulty with DELETEing any existing flag
and then INSERTing unconditionally, though I may have had an objection to
this previously:

set_flag ( set<obj> s ) {
  SQL "begin;"
  foreach obj in s {
    SQL "delete from obj_flags where obj_id = " obj.id " and flag='F'"
    SQL "insert into obj_flags values (" obj.id ", 'F');"
  }
  SQL "end;"
}
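
Incidentally, the existence test from the SELECT-before-INSERT idea can
be folded into a single statement.  A sketch, using 42 as a stand-in for
one obj.id (this is just the same check expressed in one command, so it
does not by itself remove the race against a concurrent insert of the
same row):

insert into obj_flags
  select 42, 'F'
  where not exists (
    select 1 from obj_flags where obj_id = 42 and flag = 'F'
  );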

Surely this should be straightforward.  Can someone please point out the
error of my ways?

Many thanks in advance for your help.

Regards,

--Phil Endecott.



