Dear all,

I need to optimize a database used by approximately 10 people. I don't need a perfect configuration, just to avoid obvious bottlenecks and to follow best practices.

The database is used through a web interface all work day with "normal" queries (nothing very special). And each morning, huge tables are DELETEd and all the data is INSERTed fresh from a script. (Well, "huge" is very relative: it's only 400,000 records.)

For now, we have only planned a VACUUM ANALYZE each night. But the database complained about checkpoint_segments (currently = 3).

What should be changed first to improve speed?

* memory?
* ???

Thanks a lot for any advice. (I know there are plenty of archived discussions on this subject, but it's always difficult to tell what is very important, and what is general advice as opposed to a solution to a specific problem.)

Have a nice day!

Denis

-- 
Sent via pgsql-performance mailing list (pgsql-performance@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance
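For context, the warning mentioned above concerns the checkpoint settings in postgresql.conf. The values below are, I believe, the stock defaults of that era (checkpoint_segments = 3 is the one the log complains about when the morning bulk reload cycles through the WAL too quickly):

```
# postgresql.conf -- checkpoint-related settings (defaults shown,
# values illustrative, not a tuning recommendation)
checkpoint_segments = 3     # number of 16MB WAL segments between checkpoints;
                            # too low for a bulk DELETE + INSERT of ~400,000 rows
checkpoint_timeout = 5min   # maximum time between automatic checkpoints
checkpoint_warning = 30s    # log a warning if checkpoints happen more
                            # often than this
```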