
Re: Why there are 30000 rows in sample

Hi,

On Sat, Apr 04, 2020 at 10:07:51AM +0300, Andrus wrote:
> Hi!
> 
> vacuumdb output:
> 
> vacuumdb: vacuuming database "mydb"
> INFO:  analyzing "public.mytable"
> INFO:  "mytable": scanned 2709 of 2709 pages, containing 10834 live rows and
> 0 dead rows; 10834 rows in sample, 10834 estimated total rows
> 
> For tables with more than 30000 rows, it shows that there are 30000 rows in the sample.
> 
> postgresql.conf does not set the default_statistics_target value.
> It contains
> 
> #default_statistics_target = 100    # range 1-10000
> 
> So I expect that there should be 100 rows in the sample.
> Why does Postgres use 30000, or the number of rows in the table for smaller tables?
> 
> Is 30000 some magical value, and how can I control it?


That's because default_statistics_target isn't counted in single rows: ANALYZE
samples 300 rows per unit of statistics target, so the default target of 100
gives a 30000-row sample (or every live row, for tables smaller than that, as
in your output above). So yeah, the 300 multiplier is a little bit of a
magical value.

You can see more details about this value at
https://github.com/postgres/postgres/blob/master/src/backend/commands/analyze.c#L1723-L1756.
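
For example, here's a quick sketch of how you could control the sample size
for the "public.mytable" from your report ("some_column" below is just a
placeholder column name):

    -- ANALYZE samples about 300 * statistics_target rows, so the default
    -- target of 100 is what produces the 30000-row sample.

    -- Raise the target for the session (or set it in postgresql.conf)
    -- and re-analyze; the sample grows to roughly 300 * 500 = 150000 rows:
    SET default_statistics_target = 500;
    ANALYZE VERBOSE public.mytable;

    -- Or override the target for a single column only:
    ALTER TABLE public.mytable ALTER COLUMN some_column SET STATISTICS 500;
    ANALYZE VERBOSE public.mytable;

    -- Check which columns carry an explicit per-column target:
    SELECT attname, attstattarget
      FROM pg_attribute
     WHERE attrelid = 'public.mytable'::regclass AND attnum > 0;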
