
Re: very slow after a while...


Costin Manda wrote:
On Wed, 06 Apr 2005 14:07:36 +0100
Richard Huxton <dev@xxxxxxxxxxxx> wrote:


Costin Manda wrote:

 I think I found the problem. I was comparing some values incorrectly, and
based on that, every time the script ran (that is, once every 5
minutes) it deleted two tables and repopulated them with about 70
thousand records.


I'm not sure I understand what you're saying, but if you vacuum at the wrong time that can cause problems. I've shot myself in the foot before now doing something like:

DELETE FROM big_table;
VACUUM ANALYSE big_table;
COPY big_table FROM ...;   -- lots of rows

Of course, the planner now thinks there are zero rows in big_table.
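
Something like the following ordering avoids that problem (a minimal sketch only; the table and file names are placeholders, and COPY from a server-side file assumes suitable permissions):

  BEGIN;
  DELETE FROM big_table;
  COPY big_table FROM '/tmp/big_table.dat';
  COMMIT;
  -- analyse only after the load, so the planner sees the real row count
  VACUUM ANALYSE big_table;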


I mean that every 5 minutes the script does: DROP TABLE
CREATE TABLE
INSERT 70000 rows into the table
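
For illustration, a minimal sketch of such a cycle (the table, columns, and generate_series data source are stand-ins for the real script), including the ANALYSE that keeps the planner's statistics in step with the reload:

  DROP TABLE prices;
  CREATE TABLE prices (id integer PRIMARY KEY, price numeric);
  -- roughly 70,000 rows; generate_series stands in for the real data source
  INSERT INTO prices
      SELECT i, random() * 100 FROM generate_series(1, 70000) AS i;
  -- a freshly created table has no statistics until it is analysed
  ANALYSE prices;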

I thought you were trying an insert, then an update if it failed? You shouldn't get any duplicates if the table was already empty. Or have I misunderstood?
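
For reference, a minimal sketch of that insert-or-update pattern as a PL/pgSQL function, assuming a table big_table(id, val) with a unique constraint on id (all names here are illustrative):

  CREATE FUNCTION upsert_row(p_id integer, p_val text) RETURNS void AS $$
  BEGIN
      -- try the insert first; fall back to an update on a duplicate key
      INSERT INTO big_table (id, val) VALUES (p_id, p_val);
  EXCEPTION WHEN unique_violation THEN
      UPDATE big_table SET val = p_val WHERE id = p_id;
  END;
  $$ LANGUAGE plpgsql;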


--
  Richard Huxton
  Archonet Ltd

