Re: backup-strategies for large databases

I looked into data partitioning and it is definitely something we will use
soon. But as far as backups are concerned, how can I take a backup
incrementally? If I understand correctly, the idea is to partition a big table
(using a date field, for example) and then take a nightly dump of just the
'daily' partition, so that the dump of that table is small relative to the
size of the whole table. Is that right?
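
To make sure I have the idea straight, here is a rough sketch of what I am
picturing (all table, database, and date names are made up for illustration,
and I am using old-style inheritance partitioning with a CHECK constraint;
the same idea should apply to other partitioning setups):

    -- hypothetical parent table, partitioned by a date column
    CREATE TABLE measurements (
        id       bigserial,
        logdate  date NOT NULL,
        payload  text
    );

    -- one child table per day, attached via inheritance
    CREATE TABLE measurements_2011_09_15 (
        CHECK (logdate = DATE '2011-09-15')
    ) INHERITS (measurements);

    # nightly job: dump only the current day's partition
    pg_dump -Fc -t measurements_2011_09_15 mydb > measurements_2011_09_15.dump

So each night the dump only has to read the one child table for that day,
not the whole history.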

So we are talking about logical backups without PITR. I am not saying it is
a bad idea, I just want to make sure I got it right.

Thank you all again for your answers.

--
View this message in context: http://postgresql.1045698.n5.nabble.com/backup-strategies-for-large-databases-tp4697145p4702690.html
Sent from the PostgreSQL - general mailing list archive at Nabble.com.


