Re: getting list of tables from command line

On Thu, 2007-11-01 at 20:25 +0100, hubert depesz lubaczewski wrote:
> On Wed, Oct 31, 2007 at 08:01:41AM -0700, Craig White wrote:
> > my script looks like this...
> > (all I want is to get a list of the tables into a text file pg_tables)
> 
> everybody else showed some ways, but I'll ask a question: I hope
> you're not treating this as a backup? Because when you do it that way
> (on a table-by-table basis), the "backup" is practically useless, as
> it has no internal consistency.
> 
> if you want to make a backup, perhaps you can tell us exactly what you
> want to do, and why the standard pg_dump is not enough.
> 
> if you're fully aware of the issue I pointed out - sorry, I'm just
> trying to make sure you won't shoot yourself in the foot.
----
No - thanks...you raised a very good point.

First, I would like to thank all of those who responded (Reece, T.J.,
David, Filip, Ron) with great ideas...I learned a lot.

I also run pg_dumpall weekly and pg_dump on each database nightly. I
learned that lesson a long time ago, early in development, when I
upgraded my Fedora installation; the upgrade also updated PostgreSQL,
and my database no longer worked.
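
For reference, the rotation is driven from cron; the entries look
roughly like this (the times and paths here are illustrative, not my
actual crontab):

  # /etc/crontab - nightly pg_dump of the production database
  0 2 * * * postgres pg_dump -U craig MY_DB > /home/backup/postgres/production/MY_DB.sql
  # weekly pg_dumpall (globals plus all databases)
  30 3 * * 0 postgres pg_dumpall -U postgres > /home/backup/postgres/pg_dumpall.sql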

I wanted the per-table separation nightly as insurance against things
like bad migrations (Ruby on Rails alters the tables via scripted
migrations), for the ability to reload a single table's data after some
poorly conceived coding (not that I would ever do such a thing), and to
make it easier to move data from my production database to my
development database.
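
Reloading one table from these per-table dumps is then trivial; a
sketch, assuming a hypothetical table named "users" dumped by the
script below:

  # wipe the damaged table, then reload just its data
  psql -U craig MY_DB -c "TRUNCATE public.users;"
  psql -U craig MY_DB -f /home/backup/postgres/production/data/users.sql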

I am fortunate that even though I am now past 30 tables in my project
(production) and we've been adding data for a year and a half, the total
data backup is only 7 megabytes (/var/lib/pgsql/data is only 132
megabytes), so at this point I can make as many different nightly
backups via cron scripts as I like. I also turned on autovacuum in the
configuration, but this method seems much more thorough.
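
A note for anyone on 8.x who wants autovacuum: I believe it also needs
the row-level statistics collector, so something like this in
postgresql.conf (check the docs for your version):

  stats_start_collector = on
  stats_row_level = on
  autovacuum = on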

For anyone interested, this is what I settled on for my final script
(heck, I don't even bother to tar/gzip the things yet)...

*** begin pg_table_dump.scr ***
#!/bin/sh
#
# Script to identify tables, backup schema and data separately and
# then finally, vacuum each table
#
DB_NAME=MY_DB
BACKUP_PATH=/home/backup/postgres/production
MY_SCHEMA=public
PG_USER=craig
#
# -A -t (unaligned, tuples only) suppresses the column header, the
# separator line and the "(n rows)" footer, so no grep filtering needed
psql -U $PG_USER -At \
  -c "SELECT tablename FROM pg_tables WHERE \
      schemaname = '$MY_SCHEMA';" \
  $DB_NAME > $BACKUP_PATH/pg_tables
#
for i in `cat $BACKUP_PATH/pg_tables`
do
  pg_dump \
    --username=$PG_USER \
    --schema=$MY_SCHEMA \
    --table=$i \
    --schema-only \
    $DB_NAME > $BACKUP_PATH/schemas/$i.sql
  pg_dump \
    --username=$PG_USER \
    --schema=$MY_SCHEMA \
    --table=$i \
    --data-only \
    $DB_NAME > $BACKUP_PATH/data/$i.sql
  vacuumdb \
    --username=$PG_USER \
    --dbname=$DB_NAME \
    --table=$MY_SCHEMA.$i \
    --verbose \
    --full
done
*** end pg_table_dump.scr ***
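
When I do get around to compressing, it will probably just be one more
line appended to the end of the script, something like this (gzip keeps
each table's dump individually restorable):

  gzip -f $BACKUP_PATH/schemas/*.sql $BACKUP_PATH/data/*.sql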

Craig


