
Re: dealing with file size when archiving databases

On Mon, Jun 20, 2005 at 09:28:51PM -0500, Andrew L. Gould wrote:
> I've been backing up my databases by piping pg_dump into gzip and 
> burning the resulting files to a DVD-R.  Unfortunately, FreeBSD has 
> problems dealing with very large files (>1GB?) on DVD media.  One of my 
> compressed database backups is greater than 1GB; and the results of a 
> gzipped pg_dumpall is approximately 3.5GB.  The processes for creating 
> the iso image and burning the image to DVD-R finish without any 
> problems; but the resulting file is unreadable/unusable.

Tom's response is certainly something to consider; also, note that if
you "pg_dump -t" each table separately, the dumps are not necessarily
consistent with one another, meaning that you could end up with an
unrecoverable backup if a transaction modifying two (foreign-key
dependent) tables happens to run after backing up one but before
backing up the other.
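A single pg_dump run stays internally consistent, and its output can still be kept under the medium's file-size limit by splitting the compressed stream into pieces, along the lines of `pg_dump mydb | gzip | split -b 900m - mydb.dump.gz.`. A minimal round-trip sketch (a `seq` stream stands in for `pg_dump` output here, and the 100 KB piece size is purely for illustration):

```shell
# Simulate a dump stream (stand-in for: pg_dump mydb).
seq 1 200000 > original.sql

# Compress and split into pieces small enough for the target medium
# (100 KB here for illustration; something like -b 900m suits DVD-R).
gzip -c original.sql | split -b 100k - backup.gz.

# To restore: concatenate the pieces in order and decompress.
cat backup.gz.* | gunzip > restored.sql
cmp original.sql restored.sql && echo "round-trip OK"
```

Because the concatenated pieces are byte-identical to the original gzip stream, reassembling them with cat and piping through gunzip recovers the dump exactly.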

-- 
Alvaro Herrera (<alvherre[a]surnet.cl>)
"Things are good or bad according to how our opinion makes them" (Lysias)

