On Sun, May 13, 2012 at 9:01 AM, Craig James <cjames@xxxxxxxxxxxxxx> wrote:
>
> In my experience (PG 8.4.x), the system can handle in the neighborhood of
> 100,000 relations pretty well. Somewhere over 1,000,000 relations, the
> system becomes unusable. It's not that it stops working -- day-to-day
> operations such as querying your tables and running your applications
> continue to work. But system operations that have to scan for table
> information seem to freeze (maybe they run out of memory, or are
> encountering an O(N^2) operation and simply cease to complete).
>
> For example, pg_dump fails altogether. After 24 hours, it won't even start
> writing to its output file. The auto-completion in psql of table and column
> names freezes the system. It takes minutes to drop one table. Stuff like
> that. You'll have a system that works, but can't be backed up, dumped,
> repaired or managed.
>
> As I said, this was 8.4.x. Things may have changed in 9.x.

I think some of those things might have improved, but enough of them have
not, or not by enough. So I agree with your assessment: under 9.2, having
millions of sequences might technically work, but it would render the
database virtually unmanageable.

Cheers,

Jeff

--
Sent via pgsql-performance mailing list (pgsql-performance@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance
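
[Editor's note: for readers wondering whether their own database is in the
range discussed above, the relation count can be checked directly from the
system catalog. This is a minimal sketch against pg_class (every table,
index, sequence, view, and TOAST table has a row there); it is not taken
from the thread itself.]

  -- Count catalog entries by relation kind ('r' = ordinary table,
  -- 'i' = index, 'S' = sequence, 'v' = view, 't' = TOAST table).
  SELECT relkind, count(*) AS n
  FROM pg_class
  GROUP BY relkind
  ORDER BY n DESC;

  -- Total number of relations of all kinds; totals approaching the
  -- millions are the range where the operations described above
  -- (pg_dump, psql tab completion, DROP TABLE) reportedly degrade.
  SELECT count(*) AS total_relations FROM pg_class;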