Dear List,
On further investigation, I observed that one particular connection (PID) was using approximately 32 GB of memory according to the top command.
When I terminated that PID with pg_terminate_backend, a large amount of memory was suddenly released; I confirmed this with the free command.
Please note that the PID in question belonged to one of our applications, which connects via a persistent connection.
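For reference, this is roughly how the backend was identified and terminated (a minimal sketch; the PID 30452 is purely illustrative, and the pg_stat_activity columns shown are the standard ones):

  -- cross-check the OS-level PID reported by top against the server's view of that backend
  SELECT pid, usename, application_name, state, backend_start
  FROM pg_stat_activity
  WHERE pid = 30452;  -- illustrative PID only

  -- terminate that backend; in our case the memory it held was then released back to the OS
  SELECT pg_terminate_backend(30452);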
Regards
On Sat, Feb 9, 2019 at 4:38 PM Rohit Arora <arora.leo9@xxxxxxxxx> wrote:
Dear List,

I am getting frequent "out of memory" errors on one of my PostgreSQL instances. They occur on read as well as write operations/queries. Below is one such case:

2019-02-05 02:00:02.736 IST [30452] ERROR: out of memory
2019-02-05 02:00:02.736 IST [30452] DETAIL: Failed on request of size 24 in memory context "CachedPlanQuery".
2019-02-05 02:00:02.736 IST [30452] STATEMENT: insert into tableName (<col1>,<col2>,<col3>,<col4>,<col5>,<col6>,<col7>,<col8>,<col9>,<col10>,<col11>,<col12>,<col13>,<col14>,<col15>,<col16>,<col17>,<col18>,<col19>,<col20>,<col21>,<col22>,<col23>,<col24>,<col25>,<col26>,<col27>,<col28>,<col29>,<col30>,<col31>,<col32>,<col33>,<col34>,<col35>,<col36>,<col37>,<col38>,<col39>,<col40>,<col41>,<col42>,<col43>) values ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12,$13,$14,$15,$16,$17,$18,$19,$20,$21,$22,$23,$24,$25,$26,$27,$28,$29,$30,$31,$32,$33,$34,$35,$36,$37,$38,$39,$40,$41,$42,$43)

More details here: https://pastebin.com/raw/Np1ExJ63

The table <tableName> is large: approx 70 million rows and a data size of approx 40 GB. Please note that these figures are for the mentioned table, but when the "out of memory" error occurs it affects all queries, irrespective of table size. As a matter of fact, once the server reaches such a state it starts refusing *any* new connections altogether.

Machine OS: CentOS 7.6
PostgreSQL version: 11.1
shared_buffers: 15 GB (approx 25% of server RAM)
effective_cache_size: 46 GB (approx 75% of server RAM)

As per my observation, these errors start appearing once used memory (as reported by the free command) reaches roughly the shared_buffers size.

Any help will be highly appreciated.

Regards
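For anyone following up: the settings quoted above can be confirmed on the running instance with a query like the following (a minimal sketch using the standard pg_settings view; nothing here is specific to this setup):

  -- show the memory-related settings referenced in the quoted message
  SELECT name, setting, unit
  FROM pg_settings
  WHERE name IN ('shared_buffers', 'effective_cache_size');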