Out of Memory Error during Vacuum Analyze


This is the second time I'm getting an out-of-memory error when I start a
database-wide vacuum or try to vacuum any table. Note that this machine is
used for batch data-loading purposes.

=# vacuum analyze code;
ERROR:  out of memory
DETAIL:  Failed on request of size 1073741820.

I'm running Postgres 8.1.1 on Red Hat Linux (2.6 kernel, HP server).
My maintenance work area has never been changed. It's set to 1 GB
(maintenance_work_mem = 1048576, in kB). Physical memory: 32 GB.

Bouncing the database does not help. 

Two workarounds I have used so far:

  1) Decrease maintenance_work_mem to 512 MB; vacuum analyze then works
just fine.

Or

  2) Bounce the server (keeping the original 1 GB maintenance_work_mem);
that also works.
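
For reference, the first workaround can also be applied per-session
without editing postgresql.conf or bouncing anything -- a sketch, assuming
a psql session (8.1 takes the value in kB):

```sql
-- Lower maintenance_work_mem for this session only;
-- the server-wide setting in postgresql.conf is untouched.
SET maintenance_work_mem = 524288;  -- 512 MB, expressed in kB

VACUUM ANALYZE code;

-- Optionally restore the session default afterwards.
RESET maintenance_work_mem;
```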

I have not had that error on the production instances (which are
identical copies of the loading instance) - only on the loading instance.

Any explanation as to why this happens and how to avoid it? Thanks


----
 
    Husam  


