
Re: Backup Large Tables

On Sep 21, 2006, at 10:54 PM, Charles Ambrose wrote:

I have fairly large database tables (say an average of 3 million to 4 million records). Using the pg_dump utility takes forever to dump the ...

Sounds like you're either woefully misconfigured or woefully underpowered, or have a short definition of "forever" :-)

Every night we take a dump of our several-hundred-million-row DB in about 49 minutes.  We use the "pg_dump -Fc" custom format, and the dump comes to a bit over 5GB compressed.
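For anyone curious, a minimal sketch of that kind of nightly job (the database name and output path here are placeholders, not our actual setup):

    pg_dump -Fc -f /backups/mydb.dump mydb

and to restore into an existing (empty) database:

    pg_restore -d mydb /backups/mydb.dump

One nice thing about the custom format is that it's compressed by default, and pg_restore can pull out individual tables with -t if you only need part of the dump.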



