Tom Lane wrote:
> "Jim C. Nasby" <jnasby@xxxxxxxxxxxxx> writes:
>> Is the only downside to a large value planning speed? It seems it would
>> be hard to bloat that too much, except in cases where people are
>> striving for millisecond response times, and those folks had better know
>> enough about tuning to be able to adjust the stats target...
>
> It would be nice to have some *evidence*, not unsupported handwaving.
For that you'd need a large enough sample of the statistics targets people
actually use/need. To me it has always been a bit vague under what conditions
I'd need to change the statistics size for a table; I imagine I'm not the only
one. That makes it all the harder to determine a good default value.
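Just to be clear about what I mean by "changing statistics sizes": the
per-column statistics target, or the default_statistics_target setting.
Something along these lines (table and column names made up):

    SET default_statistics_target = 100;                     -- session default
    ALTER TABLE orders ALTER COLUMN customer_id SET STATISTICS 1000;
    ANALYZE orders;                                          -- rebuild the stats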
I suppose it'd be useful to have some kind of measurement toolkit that
people could run on their databases to collect data on what statistics
sizes would be "optimal" for them. There must be some mathematical way
to determine this for a given data set?
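As a rough sketch of the kind of check such a toolkit could do (just an
example I made up, comparing pg_stats against the actual data for a
hypothetical table/column):

    -- what the planner currently thinks:
    SELECT n_distinct, most_common_freqs
      FROM pg_stats
     WHERE tablename = 'orders' AND attname = 'customer_id';
    -- (n_distinct < 0 means a fraction of the row count)

    -- what the data actually looks like:
    SELECT count(DISTINCT customer_id) FROM orders;

    -- if those disagree badly, a larger target may be warranted:
    -- ALTER TABLE orders ALTER COLUMN customer_id SET STATISTICS 1000;
    -- ANALYZE orders;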
Being able to provide these numbers to you guys would then help in
determining what a good default statistics size is, and maybe even in
devising an algorithm to adjust statistics sizes on the fly...
Regards,
--
Alban Hertroys
alban@xxxxxxxxxxxxxxxxx
magproductions b.v.
T: ++31(0)534346874
F: ++31(0)534346876
M:
I: www.magproductions.nl
A: Postbus 416
7500 AK Enschede
// Integrate Your World //