
Backup Large Tables

Hi!

I have some fairly large database tables (on average 3 to 4 million records each). Using the pg_dump utility takes forever to dump them. As an alternative, I wrote a program that reads all the data from a table and writes it to a text file, but this attempt to dump the tables was also unsuccessful.
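Roughly, the program does the equivalent of the following (a simplified Python/psycopg2 sketch; the connection string, table name, and output file are placeholders, not my real setup):

    import psycopg2

    # Placeholder connection string and table name.
    conn = psycopg2.connect("dbname=mydb user=me")
    cur = conn.cursor()
    cur.execute("SELECT * FROM my_table")
    rows = cur.fetchall()  # buffers the entire table in client memory,
                           # which is presumably where 3-4 million rows hurt
    with open("my_table.txt", "w") as out:
        for row in rows:
            out.write("\t".join(str(v) for v in row) + "\n")
    cur.close()
    conn.close()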

As I see it, I can dump the tables by fetching the data gradually: select a batch of records from the table, dump it to a text file, and repeat until all records have been written out. With this approach I need a primary key that uniquely identifies each record, so that each pass over the table skips the data that has already been processed.
The problem with this approach, though, is that my dumping utility will not be generic.
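Something along these lines is what I have in mind (again just a sketch; "id" stands in for whatever the primary key happens to be, which is exactly why the utility would not be generic):

    import psycopg2

    BATCH = 10000  # arbitrary batch size

    # Placeholder connection string, table, and key column; assumes an
    # integer key, with "id" as the first column of the table.
    conn = psycopg2.connect("dbname=mydb user=me")
    cur = conn.cursor()
    last_id = 0  # start below the smallest key value
    with open("my_table.txt", "w") as out:
        while True:
            cur.execute(
                "SELECT * FROM my_table WHERE id > %s ORDER BY id LIMIT %s",
                (last_id, BATCH),
            )
            rows = cur.fetchall()
            if not rows:
                break  # all records have been written
            for row in rows:
                out.write("\t".join(str(v) for v in row) + "\n")
            last_id = rows[-1][0]  # resume after the highest key seen so far
    cur.close()
    conn.close()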

Are there any alternatives?

Thanks for your help in advance!
