using a postgres table as a multi-writer multi-updater queue

Hi All,

I wondered if any of you could recommend best practices for using a postgres table as a queue. Roughly speaking, 100-200 writers will vomit rows at rates of a few hundred per second into the table, leaving the status as 'new', and then as many workers as needed to keep up with the load will plough through the queue, changing the status to something other than 'new'.

My naive implementation would be something along the lines of:

CREATE TABLE event (
    ts        timestamp,
    event     char(40),
    status    char(10),
    CONSTRAINT pkey PRIMARY KEY(ts, event)
);

...with writers doing INSERT or COPY to get data into the table and readers doing something like:
SELECT * FROM event WHERE status='new' LIMIT 1000 FOR UPDATE;
...so, grabbing batches of 1,000, working on them and then setting their status.
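Spelled out, the naive cycle I have in mind for each worker is something like the following (a sketch only; the 'done' status and the key list in the UPDATE are placeholders, and I suspect this is exactly where my locking question below bites):

```sql
-- Writer side: plain inserts (or COPY for bulk loads).
INSERT INTO event (ts, event, status)
VALUES (now(), 'some-event-id', 'new');

-- Reader side: one worker's claim-and-process cycle.
BEGIN;

-- Claim a batch of unprocessed rows.
SELECT ts, event
FROM event
WHERE status = 'new'
LIMIT 1000
FOR UPDATE;

-- ...process the batch in the application...

-- Mark the claimed rows so other workers skip them.
UPDATE event
SET status = 'done'            -- placeholder terminal status
WHERE (ts, event) IN (...);    -- keys of the rows claimed above

COMMIT;
```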

But, am I correct in thinking that SELECT FOR UPDATE will not prevent multiple workers selecting the same rows?

Anyway, is this approach reasonable? If so, what tweaks/optimisations should I be looking to make?

If it's totally wrong, how should I be looking to approach the problem?

cheers,

Chris
