Re: Backup Large Tables

Are you dumping the whole database or just a single table? If it's the former, try the latter and see if you still get errors.
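For example, to dump just one table (the table and database names here are placeholders):

pg_dump -t mytable mydb > mytable.sql

If the single-table dump succeeds but the full dump fails, that narrows the problem down to some other object in the database.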

If pg_dump is not working, maybe some system table is hosed. What errors are you getting?

If you can get in via psql, log in as a superuser and execute:

COPY mytable TO '/tmp/mytable.txt';

That will dump the table data to a text file on the server (the path must be absolute), which can then be re-imported into a new database with COPY FROM. Basically, you're doing by hand part of what pg_dump does for you.
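For example, in the new database (a minimal sketch; the table and path are placeholders, and the target table must already exist with the same columns, since COPY moves data only, not schema):

COPY mytable FROM '/tmp/mytable.txt';

If you can't log in as a superuser, psql's \copy command does the same thing client-side, reading and writing files on your own machine instead of on the server.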

-Casey

On Sep 21, 2006, at 9:19 PM, Charles Ambrose wrote:

Hi!

I encounter errors when dumping the database using pg_dump. I think the database is corrupt: pg_dump was looking for triggers and stored procedures that are no longer in the database. This is also the reason I opted to create a program to dump the database myself.

On 9/22/06, Michael Nolan <htfoot@xxxxxxxxx> wrote:

I have a table with over 6 million rows in it that I dump every night. It takes less than 2 minutes to create a file that is around 650 MB.

Are you maybe dumping this file in 'insert' mode?
--
Mike Nolan
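For reference, 'insert' mode here means pg_dump's --inserts option, which writes one INSERT statement per row; it is slower to dump and much slower to restore than the default COPY format. A quick comparison (the database name is a placeholder):

pg_dump --inserts mydb > mydb.sql   # one INSERT per row: slow
pg_dump mydb > mydb.sql             # default COPY format: fast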


On 9/21/06, Charles Ambrose <jamjam360@xxxxxxxxx> wrote:

Hi!

I have some fairly large database tables (averaging 3 to 4 million records each). Using the pg_dump utility takes forever to dump them. As an alternative, I created a program that reads all the data from a table and writes it to a text file, but I was also unsuccessful with that approach.