Just throwing out some more innovative ideas.
Materialized join tables are something I have read about somewhere. OK, those are
difficult to keep consistent with transactions. Forget that.
But why not collect statistics on every join that is processed, even if
the query is interrupted? Then, as more and more plans are run (and
interrupted for being too slow), statistics on those joins accumulate
and can inform the optimizer next time not to try that approach again.
It would work like magic for the user.
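To make that concrete, here is a purely hypothetical sketch of the kind of
per-join feedback such a mechanism might record. Nothing like this exists in
PostgreSQL today; the table name and columns are invented only to illustrate
the idea:

    -- Hypothetical feedback store, populated by the executor for every join
    -- node it actually runs, even when the query is later cancelled.
    CREATE TABLE pg_join_feedback (
        relids        oid[]       NOT NULL,  -- base relations participating in the join
        join_clause   text        NOT NULL,  -- normalized join condition
        est_rows      bigint      NOT NULL,  -- planner's row estimate for this join
        observed_rows bigint      NOT NULL,  -- rows actually produced before completion or cancel
        complete      boolean     NOT NULL,  -- false if the query was interrupted mid-join
        last_seen     timestamptz NOT NULL DEFAULT now()
    );

On the next planning cycle the optimizer could compare est_rows against
observed_rows for matching joins and penalize plans whose estimates were
badly off.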
User writes a query. It runs for 3 minutes and has produced no result. The user
interrupts the query (THANKS PgSQL for allowing that, unlike Oracle!). By
now the statistics have already been gathered.
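Just to illustrate the interrupt itself (this part already works today): in
psql, Ctrl-C sends a cancel request, and a query can also be cancelled from
another session. The query text matched below is only a placeholder:

    SELECT pg_cancel_backend(pid)
    FROM pg_stat_activity
    WHERE state = 'active'
      AND query ILIKE '%my_slow_report%'   -- placeholder pattern for the slow query
      AND pid <> pg_backend_pid();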
User reruns the query without changing anything. Because statistics on
(some of) the joins have already been gathered, at least as an initial
sample, the planner will likely choose a different plan. Say the results
now come in at 2 minutes and the user is satisfied. Meanwhile, even more
complete statistics have been collected.
Now the user changes a few query parameters and runs the query again, or
embeds it in a more complex query. This time the planner has even more
statistics and chooses an even better plan. And lo and behold, now the
results come in at 10 seconds!
At no point did the user have to analyze the EXPLAIN plan or come up with
hints, tricks, and nudges for the optimizer. And at no point did the
user have to become a DBA and run some outlandish PL/SQL procedures for
which he does not have the license key or the special privileges.
But until that is done, please put in pg_hint_plan.c. Hints don't
hurt: if you don't like them, don't use them.
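For reference, this is roughly what a pg_hint_plan hint looks like; the tables
and aliases below are made up, and the extension has to be installed and
loaded first:

    LOAD 'pg_hint_plan';

    /*+ Leading((a b) c) HashJoin(a b) NestLoop(a b c) */
    SELECT *
    FROM orders a
    JOIN order_items b ON b.order_id = a.id
    JOIN products    c ON c.id = b.product_id
    WHERE a.created_at > now() - interval '1 day';

The leading comment pins the join order and join methods for just that one
statement, without any global configuration change.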
regards,
-Gunther