PgQ can be used for this purpose. The idea is to have triggers on the table that push events into a queue, and then on that queue you can do whatever suits you best. As we don't want to keep these logs online, PgQ is the most convenient option, as it efficiently removes events as soon as they have been handled.
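For example, the capture side could be set up roughly like this (a minimal sketch, assuming the pgq schema from Skytools is installed; the queue and table names are made up and exact function names may vary between versions):

  -- create the queue the consumers will read from
  SELECT pgq.create_queue('daily_changelog');

  -- push every insert/update/delete on the table into the queue as an event
  CREATE TRIGGER changelog_trg
  AFTER INSERT OR UPDATE OR DELETE ON mytable
  FOR EACH ROW EXECUTE PROCEDURE pgq.logutriga('daily_changelog');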
PgQ - table_dispatcher.py
Takes URL-encoded events as its data source and writes them into a table in the target database.
Used to partition data. For example, change logs that need to be kept online only for a short time can be written to daily tables and then dropped once they become irrelevant (see the sketch below).
Also allows selecting which columns are written to the target database.
Creates target tables as needed, according to the configuration file.
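On the target side the daily partitioning boils down to something like this (an illustrative sketch only; the dispatcher creates the tables for you from its configuration, and the table names here are hypothetical):

  -- a fresh table is created for each day as events for that day arrive
  CREATE TABLE changelog_2007_09_13 (LIKE changelog_template);

  -- days that are no longer interesting are simply dropped
  DROP TABLE changelog_2007_09_01;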
PgQ - cube_dispatcher.py
Takes URL-encoded events as its data source and writes them into partitioned tables in the target database. Logutriga is used to create the events.
Used to provide batches of data for business intelligence and data cubes.
Only one instance of each record is stored. For example, if a record is created and then updated twice, only the latest version of the record stays in that day's table (see the sketch below).
Does not support deletes (not that it would be hard to support; we just have no need for it).
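The "latest version only" behaviour can be pictured as a delete-then-insert keyed on the primary key, roughly like this (a simplified sketch; the real dispatcher processes whole batches, and the table and column names here are made up):

  -- drop the older version of the row for that day, if any
  DELETE FROM cube_2007_09_13 WHERE id = 42;
  -- then store the newest version carried by the event
  INSERT INTO cube_2007_09_13 (id, name, price) VALUES (42, 'widget', 9.95);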
PgQ - queue_archiver.py
Writes queue contents into a file. Used for backing up queue contents for safety.
regards,
Asko
On 9/13/07, Ottavio Campana <ottavio@xxxxxxxxxxxxx> wrote:
I need to generate a diff (or something similar) of a table, day by day.
What is the best way to track insert/update/delete operations? I have two
ideas, and I'd like to hear your opinion:
1) pg_dump each day and run diff
2) modify some triggers we use and store the information in another table
I am not aware of any functionality offered by PostgreSQL. Does it exist?
If not, which solution would you prefer?