It really comes down to the life cycle of the given data, and I don’t think I would say regularly archiving/purging data is a requirement. I have worked on all of these types of systems, i.e.:

1. Regulatory applications where the transactions needed to be kept in the system for at least 7 to 10 years.

2. An application where, legally, the data needed to be removed from the system within 3 months (licensed data); new data was entering the system just as fast as it was exiting daily (a total of 2TB of data).

3. A highly transactional system where the data was purged within 30 days; however, it was streamed in real time to a datamart where it was kept forever and no purge cycle was defined.

Having a cron job to run vacuum/analyze manually is a security blanket. It is not needed if autovacuum is tuned; however, it doesn’t hurt and it is a good CYA plan. Needing to run VACUUM FULL on a regular basis is a symptom of a larger underlying issue with the system, and that’s my entire point.
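
For what it’s worth, the “security blanket” I mean is nothing fancy; a minimal sketch might look like the following (the database name, table name, schedule, and thresholds are made up here, adjust them to your own workload):

    # Hypothetical /etc/cron.d entry: nightly manual vacuum/analyze at 02:30
    # against an example database named "appdb", run as the postgres user
    30 2 * * *  postgres  vacuumdb --analyze --quiet appdb

    -- Per-table autovacuum tuning for a hypothetical hot table, so the cron
    -- job stays a backup rather than the primary mechanism
    ALTER TABLE transactions SET (
        autovacuum_vacuum_scale_factor = 0.05,   -- vacuum once ~5% of rows are dead
        autovacuum_analyze_scale_factor = 0.02   -- analyze once ~2% of rows have changed
    );

If autovacuum is keeping up with those per-table settings, the nightly job ends up doing very little work, which is exactly what you want from a CYA plan.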