"Carlo Stonebanks" <stonec.register@xxxxxxxxxxxx> writes: > Thanks for the suggestion - this concept is pretty new to me. Can you expand > a bit on the idea of how to place such a "dummy" constraint on a function, > and the conditions on which it affects the planner? Let's say that you know that the function's result column "x" can only range from 1 to 1000. The planner does not know that, and has no statistics from which it could guess, so it's going to fall back on default selectivity estimates for any WHERE clause involving x. So for instance you could tack on something like FROM ... (select * from myfunc() where x <= 1000) ... which will change the actual query result not at all, but will cause the planner to reduce its estimate of the number of rows out by whatever the default selectivity estimate for an inequality is (from memory, 0.333, but try it and see). If that's too much or not enough, you could try some other clauses that will never really reject any rows, for instance where x >= 1 and x <= 1000 where x <> -1 where x is not null Of course this technique depends on knowing something that will always be true about your data, but most people can think of something... Now this is not going to affect the evaluation of the function itself at all. What it will do is affect the shape of a join plan built atop that function scan, since joins are pretty much all about minimizing the number of intermediate rows. > Would this require that > constraint_exclusion be set on? No. regards, tom lane ---------------------------(end of broadcast)--------------------------- TIP 2: Don't 'kill -9' the postmaster