On 10/13/06 10:47, John D. Burger wrote:
> Erik Jones wrote:
> [snip]
> But with both approaches, the planner is just using the static
> statistics gathered by ANALYZE to estimate the cost of each candidate
> plan, and these statistics are based on sampling your data - they may be
> wrong, or at least misleading. (In particular, the statistic for total
> number of unique values is frequently =way= off, per a recent thread
> here. I have been reading about this, idly thinking about how to
> improve the estimate.)

What about an ANALYZE FULL <table>, which would read every record in
the table and every node in every index, storing in pg_statistic (or
some new, similar table) such items as the average and standard
deviation of the number of records in each page, b-tree depth, keys per
node, and records per key and per segment? Maybe even "average distance
between pages in the tablespace".

This would let the optimizer know things like "the value of the field
that is the first segment of an index (and the only part of the index
in the WHERE clause) describes 75% of the rows in the table, the
records are packed tightly into their pages, and the pages are close
together", so the optimizer could decide "a table scan would be much
more efficient".

In some ways, this would be similar in functionality to the existing
histogram created by ANALYZE, but it would provide a slightly different
picture.

--
Ron Johnson, Jr.
Jefferson LA USA

Is "common sense" really valid? For example, it is "common sense" to
white-power racists that whites are superior to blacks, and that those
with brown skins are mud people. However, that "common sense" is
obviously wrong.
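For comparison, the sampled estimates that a plain ANALYZE already gathers can be inspected through the pg_stats view; n_distinct and correlation are the two most relevant to the discussion above (correlation being a rough proxy for how "close together" a column's rows sit on disk). A sketch, with a hypothetical table and column name:

```sql
-- Inspect the planner's current sampled estimates for one table.
-- n_distinct:  estimated count of distinct values (a negative value
--              means a fraction of the row count); this is the
--              statistic that is frequently =way= off.
-- correlation: agreement between physical row order and the column's
--              sort order, from -1 to 1.
SELECT attname, n_distinct, correlation
FROM pg_stats
WHERE schemaname = 'public'
  AND tablename  = 'mytable';   -- hypothetical table name

-- Short of a full-table ANALYZE, the sample size can be raised per
-- column, which usually tightens the estimates:
ALTER TABLE mytable ALTER COLUMN mycol SET STATISTICS 1000;
ANALYZE mytable;
```

This does not give the page-packing detail proposed above, but it shows what the planner has to work with today.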