Tom Lane wrote:
> "Paul B. Anderson" <paul.a@xxxxxxxxxxxxxxxxx> writes:
>> I did delete exactly one of each of these using ctid and the query then
>> shows no duplicates. But, the problem comes right back in the next
>> database-wide vacuum.
>
> That's pretty odd --- I'm inclined to suspect index corruption.
>
>> I also tried reindexing the table.
>
> Get rid of the duplicates (actually, I'd just blow away all the
> pg_statistic entries for each of these tables) and *then* reindex.
> Then re-analyze and see what happens.
>
> Worst case you could just delete everything in pg_statistic, reindex it,
> and do a database-wide ANALYZE to repopulate it. By definition there's
> not any original data in that table...
>
> regards, tom lane
Hello, newbie here...

I seem to have run across this same error, and I believe I can
recreate it too. I'm running 8.1.4 on Slackware (almost 11).
I have two scripts: one creates the tables and indexes, and the other
is mostly "copy from stdin" statements. The very last line of the
second script is

VACUUM VERBOSE ANALYZE;

which eventually gives me the same error.
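For context, the scripts are shaped roughly like this; the table and
column names below are made-up placeholders, not my actual schema:

-- create script: tables and indexes (hypothetical names)
CREATE TABLE t1 (id integer, val text);
CREATE INDEX t1_id_idx ON t1 (id);

-- load script: many blocks like this, with the vacuum at the very end
COPY t1 (id, val) FROM stdin;
1	one
2	two
\.
VACUUM VERBOSE ANALYZE;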
This is a test box with test data, and this seemed to work:
delete from pg_statistic;
reindex table pg_statistic;
vacuum analyze;
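If I'd needed to keep the statistics for the unaffected tables, my
understanding of Tom's per-table suggestion is something like the
sketch below; 'mytable' is a placeholder for each affected table, and
the assumption that the duplicates show up on the (starelid, staattnum)
key is mine:

-- list the duplicated statistics rows
SELECT starelid, staattnum, count(*)
  FROM pg_statistic
 GROUP BY starelid, staattnum
HAVING count(*) > 1;

-- then, for each affected table ('mytable' is a placeholder):
DELETE FROM pg_statistic WHERE starelid = 'mytable'::regclass;
REINDEX TABLE pg_statistic;
ANALYZE mytable;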
So I'm OK for now, but I tried it again by dropping the database and
re-running both scripts, and I got the same error again. So I thought
I'd offer a test case if there is interest.
The data is 46 MB compressed and 500-ish MB uncompressed. It's a
little bit sensitive, so I'd like to be a little discreet with it, but
I can put it on a website for a developer to download.
Please contact me privately for a link:
andy@xxxxxxxxxxxxxxx
...Oh, also, this box has locked up on me a few times, so it's not the
most stable thing in the world. It did not lock up during the import of
this data, but there is flaky hardware in there someplace, so this could
very well be a hardware issue.
-Andy