Re: vacuum can't find clog

"Gregory S. Williamson" <gsw@xxxxxxxxxxxxxxxx> writes:
> INFO:  vacuuming "public.weekly_log_data"
> ERROR:  could not access status of transaction 1081340514
> DETAIL:  could not open file "/data/postgres/psql8_test/pg_clog/0407": No such file or directory

This is a data-corruption problem, ie, something has stomped on the xmin
or xmax field of a tuple.  (It may or may not be relevant that the high
order half of that number looks like ASCII "s@" ...)  Since xmin/xmax
are the first fields looked at that can be checked with any specificity,
"could not access status of transaction" is a common symptom even when
most of a page has been trashed :-(
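
For what it's worth, here is a quick sketch (plain Python, nothing from
the Postgres tree, and assuming the default 8K BLCKSZ) of why that xid
looks suspicious and why VACUUM went hunting for clog segment 0407:

import struct

BLCKSZ = 8192                      # default block size; adjust if nondefault
XACTS_PER_BYTE = 4                 # clog keeps 2 status bits per transaction
XACTS_PER_PAGE = BLCKSZ * XACTS_PER_BYTE                 # 32768
PAGES_PER_SEGMENT = 32             # SLRU segment files are 32 pages
XACTS_PER_SEGMENT = XACTS_PER_PAGE * PAGES_PER_SEGMENT   # 1048576

def inspect_xid(xid):
    be = struct.pack(">I", xid)    # bytes in numeric order
    le = struct.pack("<I", xid)    # bytes as stored on a little-endian box
    show = lambda bs: "".join(chr(b) if 32 <= b < 127 else "." for b in bs)
    print("xid %u = 0x%08X" % (xid, xid))
    print("  big-endian bytes    : %s  '%s'" % (be.hex(), show(be)))
    print("  little-endian bytes : %s  '%s'" % (le.hex(), show(le)))
    print("  clog segment needed : %04X" % (xid // XACTS_PER_SEGMENT))

inspect_xid(1081340514)   # -> 0x4073F262, '@s.b' / 'b.s@', segment 0407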

I don't recall whether we've fixed any bugs post-8.0.3 that might
explain this.  My experience is that if you inquire in any detail
you tend to find symptoms that look more like OS- or hardware-level
problems than anything that looks like it could have come from a
Postgres bug ... for instance, I recall finding a block of mail-message
text smack in the middle of a Postgres file in one case.

If you care to investigate exactly what happened, you could break out
pg_filedump and other implements of destruction and see if you can
determine exactly what's bogus in that table.  You could probably
also recover whatever data hasn't been overwritten, but the extent
of damage is impossible to guess at this point.
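
If you want a crude first pass before firing up pg_filedump, something
like this (again plain Python, assuming 8K blocks; expect false
positives on text-heavy tables) will flag blocks that are nearly all
printable ASCII, which is roughly what a stray chunk of mail text
written over a page looks like:

import sys

BLCKSZ = 8192          # default block size
THRESHOLD = 0.98       # real heap pages have headers and line pointers,
                       # so ~100% printable bytes is a red flag

def scan(path):
    with open(path, "rb") as f:
        blkno = 0
        while True:
            block = f.read(BLCKSZ)
            if not block:
                break
            printable = sum(1 for b in block
                            if 32 <= b < 127 or b in (9, 10, 13))
            if printable >= THRESHOLD * len(block):
                print("block %d: %d/%d printable bytes -- doesn't look like a heap page"
                      % (blkno, printable, len(block)))
            blkno += 1

scan(sys.argv[1])      # point it at the table's file under $PGDATA/base/<dboid>/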

Otherwise, you might just consider this a wake-up call to update your
Postgres and kernel, and maybe run some memory and disk testing.

			regards, tom lane

