Is there a way to determine the needed number of FSM pages?
I have a database that I started with max_fsm_pages = 1,000,000, and I was
running VACUUM ANALYZE VERBOSE daily, checking every couple of days to make
sure we had enough FSM pages.
A few days ago I noticed the "Consider increasing the configuration
parameter" hint, recommending 1,216,864. So I first increased it to
1.5 million and switched to running VACUUM ANALYZE twice a day.
Ever since, I have been gradually increasing max_fsm_pages, and every time
the recommendation comes in about 200,000 over what I have set. Finally I
tried going from 3 million to 5 million, and it still suggests
that I need about 200K more than what I have.
I have also decreased autovacuum_vacuum_scale_factor to 0.1, but given
that this database is already over 100GB, I am considering setting
autovacuum_vacuum_scale_factor to 0.01.
We have three 100GB+ databases, and growing.
I have also set
autovacuum_vacuum_threshold = 50000    # min # of tuple updates before vacuum
autovacuum_analyze_threshold = 100000  # min # of tuple updates before analyze
but it doesn't seem to be helping.
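For context on why I suspect the scale factor matters more than the
thresholds: per the docs, autovacuum vacuums a table once dead tuples exceed
autovacuum_vacuum_threshold + autovacuum_vacuum_scale_factor * reltuples.
A quick sketch with a hypothetical 500-million-row table (the row count is
made up, just to show the orders of magnitude):

```python
def autovacuum_trigger(reltuples, threshold=50_000, scale_factor=0.1):
    """Approximate dead tuples needed before autovacuum vacuums a table,
    using the documented formula: threshold + scale_factor * reltuples."""
    return threshold + scale_factor * reltuples

# Hypothetical 500M-row table: at scale_factor 0.1 autovacuum waits for
# ~50 million dead tuples; at 0.01 it would be ~5 million.
print(autovacuum_trigger(500_000_000, scale_factor=0.1))
print(autovacuum_trigger(500_000_000, scale_factor=0.01))
```

On tables that size, the threshold term (50,000) is noise next to the
scale-factor term, which is why raising the thresholds changes little.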
Right now we are migrating millions of records from an old system, so there
is a very high volume of inserts/updates/deletes (updates/deletes on some
temporary tables, but once inserted the data is never changed).
In the output of VACUUM FULL ANALYZE I see:
Current limits are: 5000000 page slots, 1000 relations, using 29362 KB.
Is that space on disk or in memory?
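As a rough sanity check on that figure: if I read the docs right, the free
space map in this release lives in shared memory, at about 6 bytes per page
slot and about 70 bytes per tracked relation. Those per-item costs are my
assumption from the documentation, but the arithmetic lands close to the
reported 29362 KB:

```python
# Assumed costs from the docs: ~6 bytes of shared memory per FSM page
# slot, ~70 bytes per relation slot.
BYTES_PER_PAGE_SLOT = 6
BYTES_PER_RELATION = 70

def fsm_shared_memory_kb(max_fsm_pages, max_fsm_relations):
    """Approximate shared memory used by the free space map, in KB."""
    total_bytes = (max_fsm_pages * BYTES_PER_PAGE_SLOT
                   + max_fsm_relations * BYTES_PER_RELATION)
    return total_bytes / 1024

# 5,000,000 page slots and 1,000 relations, as in the VACUUM output above.
print(fsm_shared_memory_kb(5_000_000, 1_000))  # within a few KB of 29362
```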