Re: Backup Large Tables


 



Hi!

I encounter errors when dumping the database using pg_dump. I think the database is corrupt: pg_dump was looking for triggers and stored procedures that are no longer in the database. This is also the reason I opted to write a program to dump the database.
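When pg_dump fails on broken schema objects, one workaround is to dump the data separately from the schema, or to dump one table at a time to isolate the offending object. A minimal sketch (the database name "mydb" and table name "orders" are placeholders, not from the thread):

```shell
# Data-only dump: skips recreating triggers/functions in the output.
# --disable-triggers makes the resulting script disable triggers
# during restore so missing/broken triggers don't fire.
pg_dump --data-only --disable-triggers -f mydb_data.sql mydb

# Dump a single table at a time to find which object pg_dump chokes on:
pg_dump --data-only -t orders -f orders_data.sql mydb
```

This does not repair the corruption, but it can recover the row data while the schema problems are investigated separately.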

On 9/22/06, Michael Nolan <htfoot@xxxxxxxxx> wrote:
I have a table with over 6 million rows in it that I do a dump on every night.  It takes less than 2 minutes to create a file that is around 650 MB.

Are you maybe dumping this file in 'insert' mode?
--
Mike Nolan
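For context on the "insert mode" question above: pg_dump's default output uses COPY, which is far faster to dump and restore than per-row INSERT statements. A sketch of the two invocations ("mydb" is a placeholder database name):

```shell
# Default: table data is emitted as COPY blocks — fast for millions
# of rows. -Fc selects the compressed custom archive format.
pg_dump -Fc -f mydb.dump mydb

# --inserts emits one INSERT statement per row — much slower to
# generate and to restore; only useful for portability to other DBMSs.
pg_dump --inserts -f mydb_inserts.sql mydb
```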


On 9/21/06, Charles Ambrose < jamjam360@xxxxxxxxx> wrote:
Hi!

I have some fairly large database tables (averaging 3 to 4 million records each). Using the pg_dump utility takes forever to dump them. As an alternative, I wrote a program that reads all the data from the tables and writes it to a text file, but I was also unsuccessful with that approach.
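Instead of a custom export program, a single table can usually be exported much faster with COPY, which is what pg_dump itself uses internally. A minimal sketch ("mydb" and "big_table" are placeholder names, not from the thread):

```shell
# Client-side export via psql's \copy — writes the file on the
# client machine and needs no superuser rights:
psql mydb -c "\copy big_table TO 'big_table.txt'"
```

Server-side `COPY big_table TO '/path/file'` is equivalent but writes on the server and requires superuser privileges; either route avoids the per-row overhead of fetching rows through a client program.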




