
pg_dump in a production environment

I have a web application backed by a PostgreSQL 7.4.6 database. The application has a fairly standard login process that is verified against the database.

I'd like to use pg_dump to grab a live backup, and based on the documentation that seems like a realistic possibility. When I try during business hours, though, while people are frequently logging in and otherwise using the application, it becomes almost unusable (to the point where logins take on the order of minutes).
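
For reference, the dump is being run more or less like this (database, user, and file names here are just placeholders, not my exact invocation):

    pg_dump -Fc -U postgres mydb > /backups/mydb.dump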

According to the documentation, pg_dump shouldn't block other operations on the database except those that require exclusive locks. Ordinarily I run pg_autovacuum on the box, so I killed it and tried again, thinking that substantial vacuum activity might be interfering with pg_dump, but that made no difference.

Setting aside the rest of the application, the login process itself should be completely read-only and shouldn't require any exclusive locks.

Connections don't really pile up excessively, and load on the machine doesn't get into the red zone. Is there anything else I should be looking at?
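
In case it's useful, this is the sort of query I could run against pg_locks while the dump is going, to see whether anything is actually waiting on a lock (column names per the 7.4 docs; I haven't tried it under load yet):

    SELECT l.pid, c.relname, l.mode, l.granted
      FROM pg_locks l
      LEFT JOIN pg_class c ON c.oid = l.relation
     WHERE NOT l.granted;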

-tfo

--
Thomas F. O'Connell
Co-Founder, Information Architect
Sitening, LLC

Strategic Open Source: Open Your i™

http://www.sitening.com/
110 30th Avenue North, Suite 6
Nashville, TN 37203-6320
615-260-0005


