I have a large table (~55 million rows) and I'm trying to create an index and then vacuum analyze it. The index has now been created, but the vacuum analyze is failing with the following error:

ERROR: out of memory
DETAIL: Failed on request of size 943718400.

I've played with several settings, but I'm not sure what I need to change to get this to run. I'm on a dual quad-core system with 4GB of memory, running PostgreSQL 8.2.3 on W2K3 Server R2 32-bit.

maintenance_work_mem is 900MB
max_stack_depth is 3MB
shared_buffers is 900MB
temp_buffers is 32MB
work_mem is 16MB
max_fsm_pages is 204800
max_connections is 50

Any help would be greatly appreciated.

Thanks,
John Cole
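
P.S. For what it's worth, the failed request size (943718400 bytes) works out to exactly 900MB, the same value as maintenance_work_mem. If it's relevant, a rough sketch of something I could try, assuming that allocation is the one being refused, is lowering the setting for just the vacuum session (the table name below is made up):

    -- Untested sketch: reduce maintenance_work_mem for this session only,
    -- in case the 900MB request is the one being refused, then re-run.
    -- "my_large_table" is a placeholder name.
    SET maintenance_work_mem = '256MB';
    VACUUM ANALYZE my_large_table;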