Re: Long term database archival

On 7/7/2006 17:49, "Csaba Nagy" <nagy@xxxxxxxxxxxxxx> wrote:

> On Thu, 2006-07-06 at 20:57, Karl O. Pinc wrote:
>> Hi,
>> 
>> What is the best pg_dump format for long-term database
>> archival?  That is, what format is most likely to
>> be able to be restored into a future PostgreSQL
>> cluster.
> 
>> Should we want to restore a 20 year old backup
>> nobody's going to want to be messing around with
>> decoding a "custom" format dump if it does not
>> just load all by itself.
> 
> Karl, I would say that if you really want the data from 20 years ago, keep
> it in the custom format, along with a copy of the PostgreSQL sources that
> created the dump. Then in 20 years, when you need it, you can compile the
> sources and load the data into the original PostgreSQL version... Of course,
> you might also need to keep an image of the current OS and the hardware
> you're running on if you really want to be sure it will work in 20
> years :-)
> 
> Cheers,
> Csaba.
> 

Depending on the size of the data (if it isn't too large), you could consider
creating a separate database for archives, perhaps even one per year.

This can live on an old server or a backup server instead of the production
one.

Unless the data is too large, you can dump/restore the archive data into each
new PostgreSQL version as you upgrade. That way the data is always available,
and you won't run into format issues when you want to retrieve it.
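The two archival options discussed in this thread, and the upgrade-time reload step, can be sketched as follows. The database name and output paths are hypothetical, and the commands are echoed rather than executed so the sketch stands alone without a running server:

```shell
#!/bin/sh
# Sketch only: DB name and paths are assumptions, not from the thread.
DB=archive2006               # hypothetical database holding one year's data
OUT=/backups/${DB}

# Option 1: plain SQL dump. Restorable with nothing but psql, so it is
# the most future-proof choice for long-term archival.
echo "pg_dump --format=plain --file=${OUT}.sql ${DB}"

# Option 2: custom-format dump. Compressed and selectively restorable,
# but it needs a compatible pg_restore binary, hence the suggestion to
# archive the matching PostgreSQL sources alongside it.
echo "pg_dump --format=custom --file=${OUT}.dump ${DB}"

# On each major-version upgrade, reload the archive into the new cluster
# so the dump never falls more than one version behind:
echo "pg_restore --dbname=${DB} ${OUT}.dump"
```

Reloading on every upgrade trades a little periodic effort for never having to resurrect a decades-old PostgreSQL build just to read the dump.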
