I'd use pg_dump anyway - unless you have hundreds of databases, it makes
it easier to keep my backups separate.
I will do that then. Thanks.
Here is the script I use for my daily backups; nothing special, but it
works well. Just run it as a user with admin privileges on the database.
It will pull the list of all your databases except the templates and dump
them out.
#!/bin/bash
export PG_BIN=/usr/local/pgsql/bin
export OUT_DIR=/db_backups/psql
export TODAY=$(date "+%Y/%m/%d")
export BACKUP_DBS=$($PG_BIN/psql template1 -t -c "SELECT
datname FROM pg_database WHERE datname NOT LIKE 'template_' ORDER BY
datname")
mkdir -p "$OUT_DIR/$TODAY"
echo "Database backup started at $(date)";
for i in $BACKUP_DBS
do
echo -n "Backing up $i...."
$PG_BIN/pg_dump -o -C $i > $OUT_DIR/$TODAY/$i
echo -n "Compressing...."
bzip2 -9 -f $OUT_DIR/$TODAY/$i
echo "Done"
done
echo -n "Backing up globals...."
$PG_BIN/pg_dumpall -g > $OUT_DIR/$TODAY/global.sql
echo "Done"
echo "Database backup ended at $(date)";
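Since the script writes one dated directory per day, the dumps will pile up
over time. A minimal pruning sketch to pair with it (the 14-day retention
and the `prune_backups` helper name are my own choices, not part of the
script above):

```shell
#!/bin/bash
# prune_backups DIR DAYS -- delete compressed dumps older than DAYS,
# then remove the empty YYYY/MM/DD directories they leave behind.
prune_backups() {
    local dir=$1 days=$2
    find "$dir" -type f -name '*.bz2' -mtime +"$days" -delete
    find "$dir" -mindepth 1 -type d -empty -delete
}

# Example invocation (paths from the backup script above):
# prune_backups /db_backups/psql 14
```

Run it from the same cron job, after the dump finishes, so a failed backup
night never coincides with a purge.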
Gavin