Hello,
We have a streaming application (using Apache Flink and Kafka) that populates tables in a PostgreSQL 15.4 database.
While loading transaction data we also receive some reference data from the source (for example, customer information). For this reference data we don't want to modify or overwrite the existing customer rows; instead, the old record should be kept and flagged as inactive, and the new record should be inserted and flagged as active. For such a use case, should we implement this logic inside the Apache Flink application code, or should we handle it with a trigger on the table that executes this logic on each INSERT?
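For reference, this is a minimal sketch of the trigger approach we are considering. It assumes a hypothetical "customer" table with a business key "customer_id" and a boolean "is_active" flag; the real table and column names would differ.

    -- Deactivate any existing active rows for the same customer,
    -- then store the incoming row as the active version.
    CREATE OR REPLACE FUNCTION deactivate_old_customer_rows()
    RETURNS trigger AS $$
    BEGIN
        UPDATE customer
           SET is_active = false
         WHERE customer_id = NEW.customer_id
           AND is_active;

        NEW.is_active := true;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER trg_customer_versioning
    BEFORE INSERT ON customer
    FOR EACH ROW
    EXECUTE FUNCTION deactivate_old_customer_rows();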
I understand that triggers are difficult to debug and monitor. But in this case my teammates are saying that we shouldn't put such logic into the streaming application code and should rather handle it through a trigger.
I understand it is technically possible both ways, but I would like to hear the experts' opinion on this and the pros and cons of each approach.
Regards
Sud