Thanks Ron. pgbackrest and barman seem to be good options.
On Sat, 16 May 2020, 02:26 Ron, <ronljohnsonjr@xxxxxxxxx> wrote:
For a database that size, I'd install pgbackrest, since it features parallel backups and compression. With it, I'd do monthly full backups and daily differential backups.
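Roughly, and untested: the stanza name "main", the NAS paths, and the schedule below are placeholders to adjust for your setup. The configuration and cron entries would look something like this:

    # /etc/pgbackrest/pgbackrest.conf
    [global]
    repo1-path=/mnt/nas/pgbackrest    # backup repository on the NAS
    repo1-retention-full=2            # keep two full backups plus their diffs
    process-max=4                     # parallel compression/transfer workers
    start-fast=y                      # checkpoint immediately instead of waiting

    [main]
    pg1-path=/var/lib/pgsql/9.2/data  # data directory of the 9.2 cluster

    # postgresql.conf must also hand completed WAL segments to pgbackrest:
    #   archive_mode = on
    #   archive_command = 'pgbackrest --stanza=main archive-push %p'

One-time initialisation, then the schedule in cron:

    pgbackrest --stanza=main stanza-create

    # crontab: full backup on the 1st, differentials the rest of the month
    0 1 1 * *    pgbackrest --stanza=main --type=full backup
    0 1 2-31 * * pgbackrest --stanza=main --type=diff backup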
(If it's mostly historical data, I'd split the database into multiple instances, so that older data rarely needs to be backed up. The application, of course, would have to be modified.)
On 5/15/20 8:26 AM, Suhail Bamzena wrote:
Thanks Rory. The machine has the capacity to pull through pg_dumps, but as you rightly mentioned, incremental backups mean we will need to work with the WALs. 18 TB is the scary part, and even with compression I don't see a daily dump coming in under 2 TB...
On Fri, 15 May 2020, 17:02 Rory Campbell-Lange, <rory@xxxxxxxxxxxxxxxxxx> wrote:
On 15/05/20, Suhail Bamzena (suhailsalem@xxxxxxxxx) wrote:
> Hello All,
> I have very recently inherited an 18 TB DB that is running version 9.2.
> Apparently this database has never been backed up, and I have been tasked
> with setting up a periodic backup routine (weekly full & daily incremental)
> and dumping it into a NAS. What is the best way to go about this? Did some
> reading and heard that pgbackrest does a good job with such huge sizes. Your
> expert advice is needed.
Incremental backups suggest the need to back up WAL archives. See
https://www.postgresql.org/docs/9.2/continuous-archiving.html
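On 9.2 the relevant postgresql.conf settings are along these lines (the NAS path is only an example, wal_level = archive is the 9.2 spelling, and changing these requires a restart):

    # postgresql.conf: enable WAL archiving
    wal_level = archive
    archive_mode = on
    # copy each completed WAL segment to the NAS, refusing to overwrite
    archive_command = 'test ! -f /mnt/nas/wal/%f && cp %p /mnt/nas/wal/%f'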
pgbackrest looks very cool, but we haven't used it.
A very simple solution could be just to dump the database daily with
pg_dump, if you have the space and machine capacity to do it. Depending
on what you are storing, you can achieve good compression with this, and
it is a great way of having a simple file from which to restore a
database.
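For instance (database name and paths invented, and note that pg_dump's parallel -j mode only arrived in 9.3, so on 9.2 the dump runs as a single process):

    # postgres user's crontab: nightly dump in custom format,
    # which is compressed by default; % must be escaped as \% in cron
    0 2 * * * pg_dump -Fc -f /mnt/nas/dumps/mydb_$(date +\%F).dump mydb

Restoring is then a single pg_restore -d mydb against that file.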
Our ~200GB cluster compresses to under 10GB of pg_dump files, although
18TB is a whole different order of size.
Rory
--
Angular momentum makes the world go 'round.