Guillaume Smet wrote:
On Mon, Jul 21, 2008 at 1:25 PM, Andreas Hartmann <andreas@xxxxxxxxxx> wrote:
SELECT pg_database.datname,
pg_size_pretty(pg_database_size(pg_database.datname)) AS size
FROM pg_database where pg_database.datname = 'vvz_live_1';
    datname    |  size
---------------+---------
 vvz_live_1    | 2565 MB
I wonder why the actual size is so much bigger than the data-only dump - is
this because of index data etc.?
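For what it's worth, a rough way to see how that space splits between table data, indexes and TOAST would be something along these lines (untested sketch, run inside the database itself; the ::bigint cast is only there because older releases don't accept numeric in pg_size_pretty()):

SELECT relkind,
       pg_size_pretty(sum(pg_relation_size(oid))::bigint) AS total
FROM pg_class
WHERE relkind IN ('r', 'i', 't')  -- 'r' = tables, 'i' = indexes, 't' = TOAST
GROUP BY relkind;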
More probably because the database is totally bloated. Do you run
VACUUM regularly or did you set up autovacuum?
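A quick way to check is something like this (sketch; the last_vacuum / last_autovacuum columns in pg_stat_user_tables exist from 8.2 on):

SHOW autovacuum;

SELECT relname, last_vacuum, last_autovacuum
FROM pg_stat_user_tables
ORDER BY relname;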
Thanks for the hint!
I just verified that the autovacuum property is enabled. I did the
following to prepare the tests:
- set up two test databases, let's call them db_all and db_current
- import the dump from the live DB into both test DBs
- delete the old semester data from db_current, leaving only the current
data
Both test DBs were about 600 MB after this. I have now run VACUUM FULL
ANALYZE on both of them. db_all didn't shrink significantly (only by
1 MB), while db_current shrank to 440 MB. We're using quite a lot of
indexes; I guess that's why so much space is allocated.
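Something along these lines should show whether the indexes really account for most of the remaining space (rough sketch, run in db_current; it just lists the largest relations of either kind):

SELECT relname, relkind,
       pg_size_pretty(pg_relation_size(oid)) AS size
FROM pg_class
WHERE relkind IN ('r', 'i')
ORDER BY pg_relation_size(oid) DESC
LIMIT 20;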
-- Andreas
--
Andreas Hartmann, CTO
BeCompany GmbH
http://www.becompany.ch
Tel.: +41 (0) 43 818 57 01