Re: "not related" code blocks for removal of dead rows when using vacuum and this kills the performance

From: Laurenz Albe <laurenz.albe@xxxxxxxxxxx>
Sent: Tuesday, February 20, 2024 8:29 AM
>Re: "not related" code blocks for removal of dead rows when using vacuum and this kills the performance
>Laurenz Albe <laurenz.albe@xxxxxxxxxxx>
>​Lars Aksel Opsahl;​
>pgsql-performance@xxxxxxxxxxxxxxxxxxxx​
>On Tue, 2024-02-20 at 05:46 +0000, Lars Aksel Opsahl wrote:
>> If this is expected behavior it means that any user on the database that writes
>> a long running sql that does not even insert any data can kill performance for
>> any other user in the database.
>
>Yes, that is the case.  A long running query will hold a snapshot, and no data
>visible in that snapshot can be deleted.
>
>That can cause bloat, which can impact performance.
>

Hi

Thanks for the chat. It seems I have finally found a solution that works for this test code.

Adding COMMITs, as in /uploads/031b350bc1f65752b013ee4ae5ae64a3/test_issue_67_with_commit.sql, to the master code, even when there is nothing to commit, seems to solve the problem. That makes sense based on what you say: the master code then releases its old snapshot and gets a new, more recent one.
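
To make the idea concrete, here is a minimal sketch (the names wait_for_workers and job_queue are made up for the example; this is not the uploaded test_issue_67_with_commit.sql): a polling loop in the master session that commits on every iteration, even when nothing was written, so the session takes a fresh snapshot each time instead of pinning the old one.

    CREATE PROCEDURE wait_for_workers()
    LANGUAGE plpgsql
    AS $$
    DECLARE
        remaining int;
    BEGIN
        LOOP
            SELECT count(*) INTO remaining
            FROM job_queue                -- made-up worker status table
            WHERE status <> 'done';

            EXIT WHEN remaining = 0;

            COMMIT;                       -- nothing to write, but this releases
                                          -- the old snapshot and starts a new
                                          -- transaction with a fresh one
            PERFORM pg_sleep(1);
        END LOOP;
    END;
    $$;

    CALL wait_for_workers();   -- transaction control only works when the
                               -- procedure is CALLed outside a transaction block

With the COMMIT in the loop, VACUUM in other sessions can clean up rows that died while the master was waiting.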

The reason I like to use psql as the master/orchestration code, rather than C/Python/Bash and so on, is that it is simpler to use, code, and test.
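
For anyone else hitting this: the effect you describe (a long-running master session holding back cleanup) is also easy to confirm from another session. A rough sketch, nothing specific to my test code:

    -- Which sessions pin an old snapshot (oldest backend_xmin first):
    SELECT pid, state, backend_xmin, age(backend_xmin) AS xmin_age, query
    FROM   pg_stat_activity
    WHERE  backend_xmin IS NOT NULL
    ORDER  BY age(backend_xmin) DESC;

    -- VACUUM VERBOSE on the bloated table then reports something like
    -- "N dead row versions cannot be removed yet" until that session
    -- commits or ends.
    VACUUM (VERBOSE) my_table;   -- my_table is a placeholder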

Lars


