"Henry" <henry@xxxxxxxxx> writes:
> I'm trying to code a function to copy rows from one machine to another
> using dblink and cursors:

What PG version is this, exactly?

> perform dblink_connect ('dbname=db1...host=othermachine.com');
> perform dblink_open ('cur_other1', 'SELECT col1 FROM tab1');
> loop
>   fnd := 0;
>   for rec in
>     -- grab 1000 rows at a time
>     SELECT col1 FROM dblink_fetch ('cur_other1', 1000)
>       AS tab1 (col1 text)
>   loop
>     begin
>       INSERT INTO tab1 (col1) VALUES (rec.col1);
>       ...
>     exception when unique_violation then
>       -- ignore dups
>     end;
>     fnd := 1;
>   end loop;
>   if fnd = 0 then
>     exit;
>   end if;
> end loop;
> perform dblink_close ('cur_other1');
> perform dblink_disconnect();

I don't think your problem has anything to do with dblink per se.  The
repeated begin/exception blocks are apparently managing to leak some
memory per iteration.  I can't tell whether this represents a known
(and perhaps already fixed) bug; it very likely depends on details you
haven't shown us.  Do you want to try to put together a self-contained
test case?  (Again, you likely don't need dblink to exhibit the issue.)

If you just want something that works now, try probing for an existing
entry before inserting, instead of relying on catching an exception.

			regards, tom lane
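[The probe-before-insert alternative suggested above might look like the
following sketch, reusing the tab1/col1 names from the quoted code.  This is
an illustration, not tested against the poster's schema; checking for an
existing row first avoids entering (and trapping out of) the exception
subtransaction for every duplicate:]

> for rec in
>   SELECT col1 FROM dblink_fetch ('cur_other1', 1000)
>     AS tab1 (col1 text)
> loop
>   -- probe for an existing entry instead of catching unique_violation
>   if not exists (SELECT 1 FROM tab1 WHERE col1 = rec.col1) then
>     INSERT INTO tab1 (col1) VALUES (rec.col1);
>   end if;
>   fnd := 1;
> end loop;

[Note this is only race-free if no other session inserts into tab1
concurrently; otherwise you'd still want a fallback for the occasional
unique_violation.]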