
Re: Out of memory on vacuum analyze

On Mon, 2007-02-19 at 12:47 -0600, John Cole wrote:
> I have a large table (~55 million rows) and I'm trying to create an index
> and vacuum analyze it.  The index has now been created, but the vacuum
> analyze is failing with the following error:
> 
> ERROR:  out of memory
> DETAIL:  Failed on request of size 943718400.
> 
> I've played with several settings, but I'm not sure what I need to set to
> get this to operate.  I'm running on a dual Quad core system with 4GB of
> memory and Postgresql 8.2.3 on W2K3 Server R2 32bit.
> 
> maintenance_work_mem is 900MB
> max_stack_depth is 3MB
> shared_buffers is 900MB
> temp_buffers is 32MB
> work_mem is 16MB
> max_fsm_pages is 204800
> max_connections is 50
> 

You told PostgreSQL that you have 900MB available for
maintenance_work_mem, but your OS is denying the request: 943718400
bytes is exactly 900MB, so the allocation that failed is the
maintenance_work_mem buffer itself. On a 32-bit build each backend has
roughly 2GB of address space, and with shared_buffers also taking
900MB there may simply be no contiguous 900MB region left. Try
*lowering* that setting to something that your OS will allow. 900MB
seems like an awfully high setting to me for a 32-bit system.
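
As a minimal sketch (the table name "my_large_table" is a placeholder
for your 55-million-row table, and 256MB is just an arbitrary lower
value your OS should be able to satisfy), you can override the setting
for a single session instead of editing postgresql.conf:

    -- temporarily lower the per-operation maintenance memory
    SET maintenance_work_mem = '256MB';
    -- then rerun the failing command
    VACUUM ANALYZE my_large_table;

The SET only lasts for that connection, so the server-wide default in
postgresql.conf stays untouched while you find a value that works.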

Regards,
	Jeff Davis


