On 6/27/23 11:47 AM, Jeremy Schneider wrote:
> On 6/27/23 9:32 AM, Ben Chobot wrote:
>> We certainly have databases where far more than 100 tables are updated
>> within a 10 second period. Is there a specific concern you have?
>
> Thanks Ben, not a concern, but I'm trying to better understand how common
> this might be. And I think sharing general statistics about how people
> use PostgreSQL is a great help to the developers who build and maintain it.
Given that Postgres is used at scales up into the petabyte range, it is
reasonable to assume that it can deal with large numbers of tables. This
of course depends on sufficient hardware and proactive tuning.
Personally, I think you are getting into the range of premature
optimization. There are so many ways to use Postgres that, unless you
provide a detailed example of how you want to use it, the survey you seem
to be requesting will likely have more cases that do not apply than
those that do. To me the way forward is to create a plan for what you
want to accomplish and then either ask specific questions based on that, or
build a test/dev setup that implements the plan and deal with the
deviations, if any, from the plan.
> -Jeremy
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx