Hi experts,
Our application serves multiple tenants. Each tenant gets its own schema containing a few hundred tables and a few functions.
We have 2000 clients, so we need to create 2000 schemas in a single database.
While doing this, I observed that the catalog tables pg_attribute, pg_class, and pg_depend grow huge in both row count and size.
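For reference, this is roughly how I've been checking the catalog sizes (a standard query using pg_total_relation_size; the exact table list is just the ones I happened to look at):

```sql
-- Report total on-disk size (table + indexes + TOAST) of the
-- catalog relations that seem to grow the most in our setup.
SELECT relname,
       pg_size_pretty(pg_total_relation_size(oid)) AS total_size
FROM pg_class
WHERE relname IN ('pg_class', 'pg_attribute', 'pg_depend')
ORDER BY pg_total_relation_size(oid) DESC;
```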
Do you think this will become a challenge during the execution of every query?
When Postgres parses an SQL statement to find the best execution plan, does it scan any of these catalogs in a way that could take longer as they grow?
Are there any other challenges you have come across or foresee in such cases?
Thanks,
Sammy.