I think I found the problem. I was comparing some values incorrectly, and because of that, every time the script ran (that is, once every 5 minutes) it deleted two tables and repopulated them with about 70 thousand records.
I still don't know why that affected the speed of the database (even when the script was not running) or how to fix it. I have a script that runs VACUUM FULL and reindexes the database every day. Is there something else I should be doing?
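For reference, the daily maintenance described above might look something like the following (a sketch only; the actual table and database names here are assumptions, not from the original message):

```sql
-- Rewrite the table files to reclaim dead space (takes an exclusive lock)
VACUUM FULL;

-- Rebuild every index in the database (name is hypothetical)
REINDEX DATABASE mydb;
```

Note that VACUUM FULL and REINDEX both take heavy locks, so a job like this is usually scheduled during a quiet period.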
I'm not sure I understand what you're saying, but if you vacuum at the wrong time it can cause problems. I've shot myself in the foot before doing something like:
DELETE FROM big_table;
VACUUM ANALYSE big_table;
COPY lots of rows into big_table
Of course, the planner now thinks there are zero rows in big_table.

--
Richard Huxton
Archonet Ltd
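To illustrate the fix implied above: running the statistics collection after the table is repopulated leaves the planner with an accurate row count. A sketch, assuming the same big_table and a hypothetical data file path:

```sql
-- Empty the table first
DELETE FROM big_table;

-- Reload the data (file path is hypothetical)
COPY big_table FROM '/path/to/data.csv';

-- Only now refresh the planner statistics, so they reflect the ~70k rows
VACUUM ANALYSE big_table;
```

With the ANALYSE moved after the reload, the planner sees the real row count instead of zero rows.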