Re: [HACKERS] pg_dump and thousands of schemas


Denis <socsam@xxxxxxxxx> writes:
> Tom Lane-2 wrote
>> Hmmm ... so the problem here isn't that you've got 2600 schemas, it's
>> that you've got 183924 tables.  That's going to take some time no matter
>> what.

> I wonder why pg_dump has to deal with all 183924 tables when I asked it
> to dump only one schema ("pg_dump -n schema_name"), or even just one
> table ("pg_dump -t 'schema_name.comments'")?

It has to know about all the tables even if it's not going to dump them
all, for purposes such as dependency analysis.
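
As a rough illustration of the scale involved (a sketch only, not
pg_dump's actual catalog query; "yourdb" is a placeholder database
name), something along these lines counts the user tables pg_dump has
to enumerate before it can decide what to skip:

    # Approximate set of tables pg_dump must examine before filtering.
    # Illustration only; "yourdb" stands in for your database name.
    psql -d yourdb -c "
        SELECT count(*)
          FROM pg_class c
          JOIN pg_namespace n ON n.oid = c.relnamespace
         WHERE c.relkind = 'r'
           AND n.nspname NOT IN ('pg_catalog', 'information_schema');"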

> We have a web application where we create a schema with a number of tables
> in it for each customer. This architecture was chosen to make backing up
> and restoring individual customers' data easier.

I find that argument fairly dubious, but in any case you should not
imagine that hundreds of thousands of tables are going to be cost-free.
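
For what it's worth, the per-schema dump/restore workflow itself is
simple enough; the expense is in the catalog work sketched above, not
in the commands. A minimal sketch, with "customer_42", "mydb", and
"mydb_restore" as placeholder names:

    # Dump a single customer's schema in custom format.
    pg_dump -n customer_42 -Fc -f customer_42.dump mydb

    # Restore only that schema into another database.
    pg_restore -d mydb_restore -n customer_42 customer_42.dump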

			regards, tom lane




