
Re: pg_dump of database with numerous objects


 



On 5/31/20 8:05 AM, tony@xxxxxxxxxxxxxxxxxxx wrote:
> I have always used pg_basebackup to back up my database and I have never had any issues.
>
> I am now needing to upgrade to a new version of PostgreSQL and I am running into problems when pg_upgrade calls pg_dump. pg_dump stalled at "pg_dump: saving database definition" for 24 hours before I killed the process.

Were you using the jobs option?

https://www.postgresql.org/docs/12/pgupgrade.html

-j njobs
--jobs=njobs

    number of simultaneous processes or threads to use
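For reference, a hedged sketch of how --jobs might be passed to pg_upgrade; the data and binary directories below are assumptions, not paths from the original post:

```shell
# Hypothetical 11 -> 12 upgrade; substitute your actual directories.
pg_upgrade \
  --old-datadir=/var/lib/postgresql/11/data \
  --new-datadir=/var/lib/postgresql/12/data \
  --old-bindir=/usr/lib/postgresql/11/bin \
  --new-bindir=/usr/lib/postgresql/12/bin \
  --jobs=10
```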


> My pg_class table contains 9,000,000 entries and I have 9,004 schemas.

> I was able to get output from pg_dump by using the -n option to dump schemas matching a wildcard pattern: -n 'data???x', where x was a digit from 0 to 9. This way I could run 10 concurrent pg_dump processes and dump the database in 30 minutes. I then dumped the public schema and used pg_dumpall to dump the globals.

> Can anyone tell me if there is something else I need to do to manually dump the database? What I did seems to have restored correctly on the upgraded server, but I want to make sure I haven't missed anything that will creep up on me later.
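The parallel approach described above can be sketched roughly as follows. The database name "mydb" and the schema-name pattern are assumptions based on the post; by default this prints the commands rather than running them:

```shell
#!/bin/sh
# Sketch of the poster's manual parallel dump: 10 pg_dump processes,
# each selecting the schemas whose names end in one digit, followed by
# the public schema and the cluster globals.
DB=mydb            # assumed database name
RUN=${RUN:-echo}   # dry run by default; set RUN= to actually execute

for x in 0 1 2 3 4 5 6 7 8 9; do
  # pg_dump -n takes a pattern where ? matches any single character,
  # so 'data???'"$x" selects every schema ending in the digit $x.
  $RUN pg_dump -Fc -n 'data???'"$x" -f "batch_$x.dump" "$DB" &
done
wait

$RUN pg_dump -Fc -n public -f public.dump "$DB"
$RUN pg_dumpall --globals-only -f globals.sql
```

Note that pg_dumpall --globals-only is needed because per-database pg_dump runs never include roles and tablespaces.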




--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx




