Here is a sample dump that takes a long time to be written by pg_dump:

http://postgresql.1045698.n5.nabble.com/file/n5710183/test.dump.tar.gz

(The file above is 2.4 MB; the dump itself is 66 MB.)

This database has 2,311 schemas similar to those in my production database. All schemas are empty, but pg_dump still takes 3 hours to finish on my computer. So now you can imagine my production database with more than 20,000 schemas like that. Can you guys take a look and see if the code has room for improvement?

I generated this dump with PostgreSQL 9.1 (which is what I have on my local computer), but my production database uses PostgreSQL 9.0, so it would be great if improvements could be delivered to version 9.0 as well.

Thanks a lot for all the help!

Hugo

--
View this message in context: http://postgresql.1045698.n5.nabble.com/pg-dump-and-thousands-of-schemas-tp5709766p5710183.html
Sent from the PostgreSQL - performance mailing list archive at Nabble.com.
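For anyone who wants to reproduce a setup like this without downloading the attachment, here is a minimal sketch that generates a comparable number of empty schemas and times pg_dump against them. The schema name prefix, database name, and schema count are assumptions for illustration; they are not taken from the attached dump.

```shell
#!/bin/sh
# Generate SQL that creates many empty schemas, mimicking the reported
# database shape (2,311 empty schemas). The "test_schema_" prefix is an
# assumption, not the naming used in the attached dump.
N=2311
: > create_schemas.sql
i=1
while [ "$i" -le "$N" ]; do
  echo "CREATE SCHEMA test_schema_$i;" >> create_schemas.sql
  i=$((i + 1))
done

# To run the actual benchmark, load the file into a scratch database and
# time the dump (requires a running PostgreSQL server; "schema_bench" is
# a made-up database name):
#
#   createdb schema_bench
#   psql -d schema_bench -f create_schemas.sql
#   time pg_dump -Fc schema_bench > schema_bench.dump
```

With a script like this you can vary N (e.g. 2,311 vs. 20,000) and see how pg_dump's runtime scales with the number of schemas.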