Hi,

Is Postgres supposed to be able to handle concurrent requests while doing large updates? This morning I was executing the following simple update statement, which affects about 220,000 rows in my product table:

    update product
    set is_hungry = 'true'
    where date_modified > current_date - 10;

While the update was running, the application that reads from the product table became very unresponsive.

Is it just a matter of slow I/O? CPU usage seemed very low (less than 5%), and iostat showed less than 1 MB/sec of throughput. I was running the update from psql.

Are there any settings I could tweak that would help with this sort of thing?

Thanks,

____________________________________________________________________
Brendan Duddridge | CTO | 403-277-5591 x24 | brendan@xxxxxxxxxxxxxx
ClickSpace Interactive Inc.
Suite L100, 239 - 10th Ave. SE
Calgary, AB  T2G 0V9
http://www.clickspace.com
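
P.S. In case it helps narrow things down, here is a minimal check I can run from a second psql session while the update is in flight, to see whether the readers are actually waiting on locks rather than on I/O. It only assumes the standard pg_locks system view; the regclass cast is just to show the table name.

    -- List any lock requests that have not been granted while the update runs.
    -- If this comes back empty, the readers are not blocked on locks and the
    -- slowdown is more likely an I/O or checkpoint issue.
    select pid, mode, granted, relation::regclass as locked_table
    from pg_locks
    where not granted;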