Matthew Wakeling <matthew@xxxxxxxxxxx> writes:
> In order to improve the performance, I made the system look ahead in the
> source, in groups of a thousand entries, so instead of running:
> SELECT * FROM table WHERE field = 'something';
> a thousand times, we now run:
> SELECT * FROM table WHERE field IN ('something', 'something else'...);
> with a thousand things in the IN. Very simple query. It does run faster
> than the individual queries, but it still takes quite a while. Here is an
> example query:

Your example shows the IN-list as being sorted, but I wonder whether you
actually are sorting the items in practice?  If not, you might try that
to improve locality of access to the index.

Also, parsing/planning time could be part of your problem here with 1000
things to look at.  Can you adjust your client code to use a prepared
query?  I'd try

	SELECT * FROM table WHERE field = ANY($1::text[])

(or whatever the field datatype actually is) and then push the list over
as a single parameter value using array syntax.  You might find that it
scales to much larger IN-lists that way.

			regards, tom lane
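
For illustration, a minimal client-side sketch of the prepared-query-plus-array
approach suggested above, assuming Python with psycopg2; the connection string,
"mytable", "field", and the "lookup" statement name are placeholders, not names
from the original thread:

    import psycopg2

    conn = psycopg2.connect("dbname=mydb")   # hypothetical connection details
    cur = conn.cursor()

    # Prepare once per session: the parse/plan cost is paid a single time
    # instead of once per thousand-entry batch.
    cur.execute(
        "PREPARE lookup(text[]) AS "
        "SELECT * FROM mytable WHERE field = ANY($1)"
    )

    # Each lookahead batch then travels as one array-valued parameter.
    keys = sorted(["something", "something else"])   # sorting may help index locality
    cur.execute("EXECUTE lookup(%s)", (keys,))
    rows = cur.fetchall()
    # ... process rows for this batch, then repeat EXECUTE for the next batch

psycopg2 adapts a Python list to a PostgreSQL ARRAY[...] construct, so the whole
batch is shipped as a single parameter and the prepared plan is reused on every
EXECUTE; other client libraries have their own ways of binding array parameters.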