Re: Performance degradation after successive UPDATE's

On Tue, Dec 06, 2005 at 11:08:07 +0200,
  Assaf Yaari <assafy@xxxxxxxxxxxx> wrote:
> Thanks Bruno,
> 
> Issuing VACUUM FULL seems to have no influence on the time.
That was just to get the table size back down to something reasonable.
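If you want to confirm the bloat is actually gone, comparing pg_class.relpages before and after the VACUUM FULL is a quick way to see it. A rough sketch, with 'mytable' standing in for your table name (relpages and reltuples are only refreshed by VACUUM/ANALYZE, so the numbers after the vacuum are current):

    SELECT relname, relpages, reltuples
      FROM pg_class
     WHERE relname = 'mytable';

    -- reclaim the dead space and shrink the table on disk
    VACUUM FULL mytable;

    SELECT relname, relpages, reltuples
      FROM pg_class
     WHERE relname = 'mytable';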

> I've added VACUUM ANALYZE to my script every 100 UPDATEs and run the
> test again (on a different record) and the time still increases.

Vacuuming every 100 updates should put an upper bound on how slow things
get. I doubt you need to analyze every 100 updates, but that doesn't
cost much more on top of a vacuum. However, if there is another transaction
open while you are doing the updates, that would prevent the dead rows from
being cleared out, since they are still potentially visible to it. This is
something you want to rule out.
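One way to check is to look at pg_stat_activity while the test is running and see whether any backend is sitting idle inside a transaction. A sketch assuming an 8.x-era server, where the columns are procpid and current_query (the query text is only visible if stats_command_string is turned on):

    SELECT procpid, usename, query_start, current_query
      FROM pg_stat_activity
     WHERE current_query LIKE '<IDLE> in transaction%';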

> Any other ideas?

Do you have any triggers on this table? Are you updating any other tables
at the same time, in particular ones that are referenced by the problem table
(e.g. via foreign keys)?
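In psql, \d on the table will show both. Straight out of the catalogs, something like this lists the triggers on the table and the tables it points at through foreign keys (again with 'mytable' as a placeholder; the trigger list will also include the internal RI triggers that enforce the foreign keys):

    SELECT tgname
      FROM pg_trigger
     WHERE tgrelid = 'mytable'::regclass;

    SELECT conname, confrelid::regclass AS referenced_table
      FROM pg_constraint
     WHERE conrelid = 'mytable'::regclass
       AND contype = 'f';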

