We have a database that grows in size quite quickly. Of course we back up nightly and keep a week's worth of data. However, we need to keep a few months of data online; the rest can be archived, as it is unlikely to be used again.

As I see it we can:
1) Run a query to drop/delete the old data; the downside here is that we lose it.
2) Stop the database (this is important because clients are writing to it), back it up, delete it, and recreate the database.

Has anyone done this? Do you have a script for this? I would appreciate any comments about what approaches have been used that work.

Thanks for any info.

Andrew
--
___________________________________________
Andrew J. P. Maclean
Centre for Autonomous Systems
The Rose Street Building J04
The University of Sydney 2006 NSW
AUSTRALIA
Ph: +61 2 9351 3283
Fax: +61 2 9351 7474
URL: http://www.acfr.usyd.edu.au/
___________________________________________