Paul Ramsey wrote:
> Though maybe with a really big table? (with really big objects?)
> Though still, doesn't analyze just pull a limited sample (30K approx
> max) so why would table size make any difference after a certain point?

Hi Paul,

"my" table is quite big (about 293,049,000 records), but the objects are not: nodes[] contains at most 2000 bigints, and tags[] holds up to a few hundred characters, sometimes a few thousand.

Watching the memory usage of the autovacuum process: it was getting bigger and bigger at a nearly constant rate, a few MB per minute, iirc.

I'm just recreating planet_osm_ways_nodes without "fastupdate=off" (a rough sketch of the statements is below).

regards
walter
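Roughly, what I mean by recreating the index without the storage parameter; the table and column definitions are taken from the usual osm2pgsql layout (planet_osm_ways with a nodes bigint[] column), so treat this as a sketch and adjust to your schema:

    -- drop the existing GIN index and rebuild it with default storage
    -- parameters, i.e. fastupdate left at its default (on)
    DROP INDEX IF EXISTS planet_osm_ways_nodes;
    CREATE INDEX planet_osm_ways_nodes
        ON planet_osm_ways
        USING gin (nodes);

    -- for comparison, the fastupdate=off variant would be:
    -- CREATE INDEX planet_osm_ways_nodes
    --     ON planet_osm_ways USING gin (nodes) WITH (fastupdate = off);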