If your backup file still hits that limit even after being compressed, you
may want to use the split command to break it into chunks of a size that
suits your OS.
Check out this
PostgreSQL doc page on handling large databases: http://www.postgresql.org/docs/8.1/interactive/backup.html
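
For example, the recipe on that page, adapted to the database in question,
would look something like this (the 1 GB chunk size and the file names are
just placeholders; pick anything under your filesystem's limit):

  pg_dump -U postgres dbdeveloper | split -b 1000m - backup.sql.

and to restore, concatenate the chunks back into psql:

  cat backup.sql.* | psql -U postgres dbdeveloper

Because pg_dump writes into a pipe, the oversized file is never created in
the first place.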
----
Husam
From: pgsql-admin-owner@xxxxxxxxxxxxxx [mailto:pgsql-admin-owner@xxxxxxxxxxxxxx] On Behalf Of renneyt@xxxxxxxxx
Sent: Thursday, May 04, 2006 8:46 AM
To: Jyry Kuukkanen
Cc: Rodrigo Sakai; pgsql-admin@xxxxxxxxxxxxxx
Subject: Re: [ADMIN] backup problem

> On Thu, 4 May 2006, Rodrigo Sakai wrote:
>
>> Hi, I'm trying to back up a database that is larger than 4 GB, but I
>> get an error when the dump file reaches 1.2 GB. I think it is an
>> operating system problem (Linux). Is there some solution for backing
>> up my database? The command I used was:
>>
>>   pg_dump -U postgres -d dbdeveloper -a -v -D -f 'backup.sql'
>>
>> The operating system is Linux with an ext filesystem, and the
>> PostgreSQL version is 7.4.
>
> You can try:
>
>   pg_dump -U postgres -d dbdeveloper -a -v -D | bzip2 -c > backup.sql.bz2
>
> bzip2 compresses better than gzip. Restoration:
>
>   bzcat backup.sql.bz2 | psql -U postgres -d dbdeveloper
>
> Cheers,

I use the same, but with the bzip2 '--best' parameter for best compression,
or is that the default?

Also, for the OP Rodrigo Sakai: have you tried Solaris 10 x86? It is a
rock-solid OS from Sun that I have used for 12 years, 6 of them with
Postgres. The two have coexisted without incident. It can be downloaded for
free from the Sun site.
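
As an aside, the '--best' variant described above, spelled out against the
same command, would be:

  pg_dump -U postgres -d dbdeveloper -a -v -D | bzip2 --best -c > backup.sql.bz2

Restoration is unchanged:

  bzcat backup.sql.bz2 | psql -U postgres -d dbdeveloper

Per the bzip2 man page, --best merely selects the default behaviour (it is
an alias for -9), so it is equivalent to plain bzip2 -c here.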