Hello all,
I am trying to help the Django project by investigating whether there
should be some default batch size limits for insert and delete queries.
This is related to a couple of tickets which deal with SQLite's
inability to handle more than 1000 parameters in a single query. That
backend needs a limit anyway. It might be possible to implement default
limits for the other backends at the same time, if that seems necessary.
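For context, the Django-side batching under discussion would look
roughly like this (a minimal sketch; the chunked helper, the
insert_batch name, and the example batch size are made up for
illustration, not actual Django code):

    # Sketch only: split a large set of rows into smaller batches so
    # that no single statement exceeds the backend's parameter limit.
    def chunked(items, batch_size):
        """Yield successive slices of at most batch_size items."""
        for i in range(0, len(items), batch_size):
            yield items[i:i + batch_size]

    # e.g. instead of one statement covering 10000 rows:
    # for batch in chunked(rows, 1000):
    #     insert_batch(some_table, batch)  # hypothetical helper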
If I am not mistaken, there are no practical hard limits in PostgreSQL
here, so the question is whether performance can be expected to collapse
at some point as the batch size grows.
Little can be assumed about the schema or the environment. The inserts
and deletes are going to be done in one transaction. Foreign keys are
indexed and they are DEFERRABLE INITIALLY DEFERRED by default.
The PostgreSQL version can be anything from 8.2 onwards.
The queries will be of the form:

    insert into some_table (col1, col2)
        values (val1, val2), (val3, val4), ...;

and

    delete from some_table where PK in (list_of_pk_values);
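To make the connection to the parameter limit explicit: a multi-row
insert binds (rows x columns) parameters, so a cap on parameters per
statement caps the rows per batch. A sketch of the arithmetic (the
default of 999 is SQLite's SQLITE_MAX_VARIABLE_NUMBER, the limit behind
the tickets above; the function name is illustrative):

    # Sketch: translate a backend's parameter cap into a maximum
    # number of rows per multi-row insert.
    def max_batch_size(num_columns, max_params=999):
        # Each row contributes num_columns bound parameters.
        return max_params // num_columns

    # e.g. a two-column insert can batch at most 999 // 2 == 499 rows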
So, is there some common wisdom about batch sizes, or is it better to
do the inserts and deletes in just one batch? I think the case for
performance problems needs to be strong before default limits are
considered for PostgreSQL.
The tickets in question are:
https://code.djangoproject.com/ticket/17788 and
https://code.djangoproject.com/ticket/16426
- Anssi Kääriäinen