On 5/2/07, Harpreet Dhaliwal <harpreet.dhaliwal01@xxxxxxxxx> wrote:
I'm kind of new to PostgreSQL. The project I'm currently working on parses emails, stores the parsed components in a PostgreSQL database, and fires triggers on certain inserts that open a socket connection to a Unix tools server.
Are you sure it is a good idea to do this processing synchronously? What happens if there is a network problem? It sounds like an inefficient and inflexible design.
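Just to illustrate the alternative, a rough sketch of how the insert could merely queue a notification and leave the socket work to a separate listener process; the table and channel names (parsed_emails, email_ready) are made up here:

    -- Hypothetical sketch: instead of talking to the unix tools server
    -- inside the trigger, just send a NOTIFY.  The notification is only
    -- delivered to listeners after the transaction commits.
    CREATE OR REPLACE FUNCTION notify_email_ready() RETURNS trigger AS $$
    BEGIN
        NOTIFY email_ready;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER email_ready_trg
        AFTER INSERT ON parsed_emails
        FOR EACH ROW EXECUTE PROCEDURE notify_email_ready();

A daemon that has issued LISTEN email_ready would then pick up the new rows and talk to the unix tools server, so a network problem can neither block nor roll back your inserts.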
I have done a lot of homework on this and could think of something like "bulk data storage in email parsing and how vacuuming it would increase performance", because I think this vacuum concept is not there in other RDBMSs.
SQLite also requires vacuuming, as do other databases based on MVCC-like designs, although some (e.g., Oracle with its redo logs, iirc) do their housekeeping behind the scenes.

Alexander.
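P.S. For what it's worth, the routine maintenance in PostgreSQL could look something like the following (the table name is only an example); since 8.1 the autovacuum daemon can also take care of it for you:

    -- Reclaim dead row versions left behind by updates/deletes and
    -- refresh the planner's statistics; "parsed_emails" is an example name.
    VACUUM ANALYZE parsed_emails;

    -- Or enable the autovacuum daemon in postgresql.conf and let it decide:
    -- autovacuum = on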