Hi,

On Tue, Jul 15, 2008 at 2:52 PM, Andrew Maclean <andrew.amaclean@xxxxxxxxx> wrote:
> We have a database that grows in size quite quickly. Of course we
> back up nightly and keep a week's worth of data.
>
> However, we need to keep a few months of data online; the rest can be
> archived, as it is unlikely that it will be used again.
>
> As I see it we can:
> 1) Run a query to drop/delete old data; the downside here is that we lose it.
> 2) Stop the database (this is important because clients are writing to
> it), back it up, delete it, and recreate the database. Has anyone done
> this? Do they have a script for this?

It sounds like table partitioning could be useful in your situation,
depending on what your data looks like and how you want to query it.
It's worth taking the time to read:
http://www.postgresql.org/docs/8.3/interactive/ddl-partitioning.html

If you're basically inserting a series of observations into one large
table, this could be useful: you can use it to increase the amount of
data you can easily manage, and to automate something like a rolling
2-month window of online data. A script could be put together to
periodically dump out the oldest partition, drop it, create a new
partition, and maintain the associated triggers; a rough sketch of the
idea follows below.

Charles Duffy
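
P.S. For illustration, here is a minimal sketch of the 8.3-style
inheritance partitioning described in that chapter, set up for a
rolling window of monthly data. The table and column names (readings,
logged_at) are made up for the example; adjust everything to your
actual schema, and note that constraint_exclusion needs to be enabled
in postgresql.conf for the planner to skip partitions.

  -- Parent table that clients query and insert into.
  CREATE TABLE readings (
      id        bigserial,
      logged_at timestamptz NOT NULL,
      value     double precision
  );

  -- One child table per month; the CHECK constraints let the planner
  -- exclude partitions that cannot match a query.
  CREATE TABLE readings_2008_07 (
      CHECK (logged_at >= DATE '2008-07-01' AND logged_at < DATE '2008-08-01')
  ) INHERITS (readings);

  CREATE TABLE readings_2008_08 (
      CHECK (logged_at >= DATE '2008-08-01' AND logged_at < DATE '2008-09-01')
  ) INHERITS (readings);

  -- Trigger function that routes inserts on the parent to the right child.
  CREATE OR REPLACE FUNCTION readings_insert_trigger() RETURNS trigger AS $$
  BEGIN
      IF NEW.logged_at >= DATE '2008-08-01'
         AND NEW.logged_at < DATE '2008-09-01' THEN
          INSERT INTO readings_2008_08 VALUES (NEW.*);
      ELSIF NEW.logged_at >= DATE '2008-07-01'
         AND NEW.logged_at < DATE '2008-08-01' THEN
          INSERT INTO readings_2008_07 VALUES (NEW.*);
      ELSE
          RAISE EXCEPTION 'readings_insert_trigger: date out of range';
      END IF;
      RETURN NULL;  -- suppress the insert into the parent itself
  END;
  $$ LANGUAGE plpgsql;

  CREATE TRIGGER insert_readings_trigger
      BEFORE INSERT ON readings
      FOR EACH ROW EXECUTE PROCEDURE readings_insert_trigger();

The monthly maintenance script would then (from cron, say) pg_dump the
oldest child table to your archive area, DROP TABLE it, CREATE the next
month's child with its CHECK constraint, and re-create the trigger
function with the updated date ranges.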