As a fintech company, we MUST (by force of law) audit all DML operations (SELECT, INSERT, UPDATE, DELETE) occurring on our PostgreSQL databases.
Today we use PostgreSQL's native log files for this, and we have also tested pgAudit.
In both cases (native log files and pgAudit) we see a lot of contention on the backends and high CPU usage on the instance (system % and wait %). This only happens when the database is running at a high TPS rate.
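For reference, the pgAudit setup we tested looks roughly like this (the exact values are illustrative, not our production configuration):

    # postgresql.conf -- illustrative pgAudit session-logging setup
    shared_preload_libraries = 'pgaudit'
    pgaudit.log = 'read, write'      # READ covers SELECT/COPY from; WRITE covers INSERT/UPDATE/DELETE/TRUNCATE/COPY to
    pgaudit.log_parameter = on       # include statement parameters in the audit records

followed by CREATE EXTENSION pgaudit; in each database after a restart.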
I suspect that a single PostgreSQL logger process cannot keep up with all the backend processes piping log data to it, causing contention and a performance slowdown.
So my understanding is that PostgreSQL's logging is simply not designed to log everything happening in the database under a high-TPS workload.
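For context, this is the stock logging path I mean, where every backend's log output is funnelled through one collector process:

    # postgresql.conf -- the default-style setup we are assuming
    logging_collector = on        # a single collector process gathers log output from all backends
    log_destination = 'stderr'    # each backend pipes its log lines to that one collector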
I know we can use triggers to build an audit process (see the sketch below), but I need to audit SELECTs too, and triggers do not fire on SELECT; rules on SELECT can only be defined as INSTEAD rules (effectively views), so they don't give me an audit hook either.
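For completeness, here is a minimal sketch of the trigger approach I am ruling out (the audit_log table, audit_dml() function, and accounts table are hypothetical names). It covers writes, but PostgreSQL has no ON SELECT trigger event:

    -- Hypothetical audit table
    CREATE TABLE audit_log (
        logged_at  timestamptz NOT NULL DEFAULT now(),
        username   text        NOT NULL DEFAULT current_user,
        table_name text        NOT NULL,
        operation  text        NOT NULL,
        row_data   jsonb
    );

    -- Row-level audit function: records the affected row as jsonb
    CREATE FUNCTION audit_dml() RETURNS trigger LANGUAGE plpgsql AS $$
    BEGIN
        IF TG_OP = 'DELETE' THEN
            INSERT INTO audit_log (table_name, operation, row_data)
            VALUES (TG_TABLE_NAME, TG_OP, to_jsonb(OLD));
            RETURN OLD;
        ELSE
            INSERT INTO audit_log (table_name, operation, row_data)
            VALUES (TG_TABLE_NAME, TG_OP, to_jsonb(NEW));
            RETURN NEW;
        END IF;
    END;
    $$;

    -- Only INSERT/UPDATE/DELETE (and TRUNCATE) events exist; there is no ON SELECT
    CREATE TRIGGER accounts_audit
        AFTER INSERT OR UPDATE OR DELETE ON accounts
        FOR EACH ROW EXECUTE FUNCTION audit_dml();

Even if this performed acceptably, it would still leave SELECTs unaudited, which is exactly the part we are legally required to capture.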
So I'm looking for another way to implement this all-DML audit, and I would very much appreciate your opinions on it.
Thanks in advance for any advice, and have a happy week.