Re: Variable (degrading) performance

Vladimir Stankovic wrote:
What I am hoping to see is NOT the same value for every execution of the same type of transaction (after some transient period). Instead, I'd like to see that if I take an appropriately sized set of transactions, I get at least a steady growth in average transaction times from one chunk to the next, if not exactly the same average. Each chunk might include a sudden performance drop due to the necessary vacuums and checkpoints, and the performance might also be influenced by changes in the data set. What makes me unhappy is that the durations of experiments can differ by as much as 30% (bearing in mind that the runs are not exactly identical, due to non-determinism on the client side). I would like to eliminate this variability. Are my expectations reasonable? What could be the cause(s) of this variability?

You should see that if you define your "chunk" to be long enough. Long enough is probably hours, not minutes or seconds. As I said earlier, checkpoints and vacuum are a major source of variability.
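One way to compute such per-chunk averages, as a sketch: log each completed transaction on the client side into a table and aggregate by hour. The table and column names here (txn_log, completed_at, txn_type, duration_ms) are hypothetical, standing in for whatever your harness records.

    -- Hypothetical client-side log of completed transactions:
    --   txn_log(completed_at timestamptz, txn_type text, duration_ms numeric)
    SELECT date_trunc('hour', completed_at) AS chunk,
           txn_type,
           count(*)                   AS txns,
           round(avg(duration_ms), 1) AS avg_ms,
           round(max(duration_ms), 1) AS max_ms
    FROM txn_log
    GROUP BY 1, 2
    ORDER BY 1, 2;

If avg_ms is stable (or grows steadily with the data set) across hour-sized chunks while max_ms spikes, the variability is coming from periodic background work rather than from the transactions themselves.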
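To see when checkpoints and vacuum actually run, and to soften their impact, a few postgresql.conf knobs are worth looking at. This is a sketch, not a recommendation: the values are illustrative, and checkpoint_completion_target and log_checkpoints require 8.3 or later.

    # postgresql.conf -- smoothing out checkpoint spikes and making
    # checkpoint/vacuum activity visible in the server log
    checkpoint_timeout = 15min            # checkpoint less often
    checkpoint_completion_target = 0.9    # spread checkpoint I/O over the interval
    log_checkpoints = on                  # log checkpoint start/finish times
    autovacuum_vacuum_cost_delay = 20ms   # throttle autovacuum I/O

With log_checkpoints on, you can line up the timestamps in the server log against the dips in your per-chunk averages and confirm what is causing them.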

--
  Heikki Linnakangas
  EnterpriseDB   http://www.enterprisedb.com

