Jorge Torralba <jorge.torralba@xxxxxxxxx> writes:
> Doing some forensic work on a Postgres 9.5 database that is in testing
> with only 19 relations and over 100,000 sequences. Once this database
> goes live, it could see in excess of 1 million sequences created due to
> the complexity of the application. There are obvious risks such as
> pg_dump issues and slow response when scanning catalog tables for info.
> But are there any serious issues that can show up from this situation?
> I know that, theoretically, Postgres can have unlimited tables in a
> database, but I am looking for some realistic worst-case scenarios in
> an environment like the one described.

Reminds me of this talk:
http://www.pgcon.org/2013/schedule/events/595.en.html

1M tables is a lot short of 1B, but you'll still start running into some
of the same issues Alvaro described.  Personally I'd look for another way
to do it.

(We've occasionally batted around the idea of merging all sequences into
one catalog, which would help a lot; but the compatibility breakage that
would ensue is a bit daunting.)

			regards, tom lane
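
[Editor's note: the thread doesn't spell out what "another way to do it"
might look like. One common alternative, sketched below under the
assumption that the application wants an independent counter per entity,
is to store all counters as rows in a single table rather than creating a
separate sequence relation for each one. The table and column names
(entity_counters, entity_id, last_value) are hypothetical. INSERT ... ON
CONFLICT is available as of 9.5, matching the poster's version.]

    -- Hypothetical schema: one row per logical counter, instead of one
    -- pg_class entry per sequence.  This keeps the catalogs small no
    -- matter how many counters the application creates.
    CREATE TABLE entity_counters (
        entity_id  bigint PRIMARY KEY,
        last_value bigint NOT NULL DEFAULT 0
    );

    -- Atomically claim the next value for entity 42, creating the
    -- counter on first use.
    INSERT INTO entity_counters AS c (entity_id, last_value)
    VALUES (42, 1)
    ON CONFLICT (entity_id)
    DO UPDATE SET last_value = c.last_value + 1
    RETURNING last_value;

[The trade-off: unlike a real sequence, the UPDATE takes a row lock, so
concurrent transactions incrementing the same entity_id serialize until
commit, and the increment rolls back with the transaction. Whether that
is acceptable depends on the application's concurrency needs.]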