Dear Mr. Tom Lane,

Thank you very much for your answer. It seems that the legacy application creates tables dynamically, and the number of tables created depends on the size of the application's input. For the specific input that generated the error, I estimated that about 4000 tables were created. Could this be the problem?

With best regards,
Sorin

-----Original Message-----
From: Tom Lane [mailto:tgl@xxxxxxxxxxxxx]
Sent: Tuesday, March 27, 2007 6:37 AM
To: Sorin N. Ciolofan
Cc: pgsql-general@xxxxxxxxxxxxxx; pgsql-admin@xxxxxxxxxxxxxx; pgsql-performance@xxxxxxxxxxxxxx
Subject: Re: [GENERAL] ERROR: out of shared memory

"Sorin N. Ciolofan" <ciolofan@xxxxxxxxxxxx> writes:
> I have to manage an application written in Java which calls another module,
> also written in Java, which uses the PostgreSQL DBMS in a Linux environment.
> I'm new to Postgres. The problem is that for large amounts of data the
> application throws an:
> org.postgresql.util.PSQLException: ERROR: out of shared memory

AFAIK the only very likely way to cause that is to touch enough different
tables in one transaction that you run out of lock entries. While you could
postpone the problem by increasing the max_locks_per_transaction setting,
I suspect there may be some basic application misdesign involved here.
How many tables have you got?

			regards, tom lane
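[Editor's note: per the PostgreSQL documentation, the shared lock table holds roughly max_locks_per_transaction * (max_connections + max_prepared_transactions) entries, shared across all sessions. The sketch below uses that formula to check whether a single transaction touching ~4000 tables is plausible as the cause; the parameter values (64, 100, 0) are the usual defaults and are assumptions here, not values taken from the poster's configuration.]

```python
# Rough capacity check for PostgreSQL's shared lock table, using the
# formula from the docs:
#   max_locks_per_transaction * (max_connections + max_prepared_transactions)
# All sessions share this pool, so one transaction can starve the others.

def lock_table_capacity(max_locks_per_transaction=64,
                        max_connections=100,
                        max_prepared_transactions=0):
    """Approximate number of object-lock slots available server-wide.

    The defaults above are the stock postgresql.conf values (assumed,
    not taken from the original poster's setup).
    """
    return max_locks_per_transaction * (max_connections +
                                        max_prepared_transactions)

capacity = lock_table_capacity()
print(capacity)  # 6400 with the assumed defaults

# A transaction that touches ~4000 tables takes at least one lock per
# table (more once indexes are counted), so with other sessions holding
# locks too, 6400 shared slots can plausibly run out.
tables_touched = 4000
print(f"{tables_touched} tables vs {capacity} slots")
```

Raising max_locks_per_transaction in postgresql.conf (it requires a server restart) enlarges the pool, which matches Tom's point: it postpones the failure rather than fixing a design that touches thousands of tables in one transaction.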