Hi everybody!
I'm new to the mailing list, and I have a little question.
The tables are:
postalcodes (place_id, code), PK(place_id, code), 600K rows
places (id, name), PK(id), INDEX(name), 3M rows
I have to insert another 600K rows into the postalcodes table, in a single
transaction, omitting duplicates.
The insert query is a prepared statement like this:
INSERT INTO postalcodes (place_id, code)
SELECT places.id, :code
FROM places
LEFT JOIN postalcodes ON (postalcodes.place_id = places.id AND
postalcodes.code = :code)
WHERE places.name = :name AND postalcodes.place_id IS NULL
Inserting rows works well (3000 queries per second), but once I reach
about 30K executed statements, the insert rate slows down to 500-1000
queries per second.
Doing a commit every 20K inserts, the insert rate remains at 3000 queries
per second.
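For reference, the batched variant is driven from the client roughly like
this (just a sketch, using Python and psycopg2 with its %(...)s parameter
style instead of the :name/:code placeholders above; the DSN and the source
of the (name, code) pairs are placeholders, not my actual loader):

import psycopg2

BATCH_SIZE = 20000  # commit every 20K inserts

INSERT_SQL = """
    INSERT INTO postalcodes (place_id, code)
    SELECT places.id, %(code)s
    FROM places
    LEFT JOIN postalcodes ON (postalcodes.place_id = places.id AND
                              postalcodes.code = %(code)s)
    WHERE places.name = %(name)s AND postalcodes.place_id IS NULL
"""

def load(rows):
    # rows is an iterable of (name, code) pairs to insert
    conn = psycopg2.connect("dbname=geo")  # placeholder DSN
    cur = conn.cursor()
    pending = 0
    for name, code in rows:
        cur.execute(INSERT_SQL, {"name": name, "code": code})
        pending += 1
        if pending >= BATCH_SIZE:
            conn.commit()  # commit the current batch
            pending = 0
    conn.commit()  # commit the final partial batch
    cur.close()
    conn.close()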
Is there a limit on the number of inserts in a single transaction?