Chris Hoover wrote:
One other option is to shut the database down completely and then copy the
file system to the new server. I have done this when I needed to move a
very large database to a new server. I can copy 500 GB in a couple of
hours, whereas restoring backups of my large databases would take 10+
hours. Just make sure you keep postgres at the same version level.
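A minimal sketch of that kind of copy, assuming the cluster lives in
/var/lib/pgsql/data on both machines (the path and the hostname "newserver"
are placeholders) and both servers run the same major PostgreSQL version:

    # on the old server: stop postgres cleanly so the data files are consistent
    pg_ctl -D /var/lib/pgsql/data stop -m fast
    # copy the whole cluster directory to the new server
    rsync -a /var/lib/pgsql/data/ newserver:/var/lib/pgsql/data/
    # on the new server: start postgres against the copied directory
    pg_ctl -D /var/lib/pgsql/data start

rsync is not required; any file-level copy (scp -r, tar over ssh) works, as
long as postgres stays fully stopped while the copy runs.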
HTH,
Chris
On 12/19/06, Arnau <arnaulist@xxxxxxxxxxxxxxxxxx> wrote:
Hi all,
I've got a DB in production that is bigger than 2GB, and dumping it
takes more than 12 hours. I have a new server to replace this old one,
where I have to restore the DB's dump. The problem is that I can't afford
to have the server out of business for so long, so I need your advice on
how you'd do this dump/restore. Most of the data is in two tables
(statistics data), so I was thinking of dumping/restoring everything
except these two tables, and once the server is running again I'd
dump/restore this data. The problem is I don't know exactly how to do this.
Any suggestions?
Thanks
--
Arnau
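One way to express the split Arnau describes, sketched with placeholder
names (mydb, stats1, stats2) and assuming pg_dump 8.2 or later, which added
--exclude-table/-T and accepts -t more than once; on an older pg_dump you
would instead have to list every non-statistics table with separate -t runs:

    # 1. dump everything except the two big statistics tables
    pg_dump -T stats1 -T stats2 mydb > mydb_small.sql
    # 2. restore that on the new server and put it back into service
    psql mydb < mydb_small.sql
    # 3. later, move the statistics tables on their own
    pg_dump -t stats1 -t stats2 mydb > mydb_stats.sql
    psql mydb < mydb_stats.sql

The statistics tables' schema and data travel together in step 3, since
they were excluded entirely from the first dump.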
How many tables have you got in your database?
If you have only a few tables, you can dump them one at a time:
pg_dump -t ....
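For example, a sketch of the one-table-at-a-time route (mydb, bigtable and
newserver are placeholders); piping pg_dump straight into psql on the new
host avoids writing an intermediate dump file:

    # run on the old server, once per table
    pg_dump -t bigtable mydb | psql -h newserver mydb

This assumes the target database already exists on the new server
(createdb mydb); each table's schema comes along with its data.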
Olivier