
Re: Backup/Restore of single table in multi TB database

On Wed, May 7, 2008 at 4:02 PM, John Smith <sodgodofall@xxxxxxxxx> wrote:

>  Does anyone see a problem with this approach (e.g., correctness,
>  performance, etc.)?  Or is there perhaps an alternative approach using
>  some other postgresql mechanism that I'm not aware of?

Did you already look at and reject pg_dump for some reason? You can
restrict it to specific tables, and it works concurrently with a
running system. Your database is large, but how large are the
individual tables you're interested in backing up? pg_dump will be
slower than a file copy, but it may be sufficient for your purpose,
and its output is guaranteed to be a consistent snapshot.
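As a minimal sketch (the table and database names here are just
placeholders):

    # Dump a single table from the live database; -Fc writes pg_dump's
    # compressed custom format, which pg_restore understands.
    pg_dump -Fc -t my_big_table mydb > my_big_table.dump

    # Later, restore just that table into the same or another database:
    pg_restore -d mydb my_big_table.dump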

I'm fairly certain you have to be very careful about doing simple
file copies while the system is running: the individual files get
copied at different times, so the resulting set can be internally
inconsistent. I haven't done it myself, but I do know there are a lot
of caveats that someone with more experience with that type of backup
can hopefully point out; my understanding is that the usual mechanism
looks roughly like the sketch below.
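As I understand it, a file-level copy is only safe when bracketed by
pg_start_backup()/pg_stop_backup() and combined with WAL archiving;
very roughly (the label and paths are placeholders, and this is not a
complete recipe):

    # Tell the server a base backup is beginning (forces a checkpoint):
    psql -d mydb -c "SELECT pg_start_backup('file_copy')"

    # OS-level copy of the data directory (placeholder paths):
    cp -a /path/to/pgdata /backup/location/

    # Mark the backup as finished; the copy is only recoverable
    # together with the WAL archived between these two calls:
    psql -d mydb -c "SELECT pg_stop_backup()"

Check the continuous archiving section of the docs before relying on
anything like this.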

-- 
- David T. Wilson
david.t.wilson@xxxxxxxxx

