Ioannis Anagnostopoulos <ioannis@xxxxxxxxxx> writes:
> I think this is a pretty good plan, and quite quick given the
> size of the table (88 million rows at present). However, in real
> life the parameter where I search for msg_id is not an array of
> 3 ids but of 300,000 or more. It is then that the query abandons
> the plan and falls back to a sequential scan. Is there any way
> around this?

If you've got that many, any(array[....]) is a bad choice. I'd try
putting the IDs into a VALUES(...) list, or even a temporary table, and
then writing the query as a join.

It is a serious mistake to think that a seqscan is evil when you're
dealing with joining that many rows, btw. What you should probably be
looking for is a hash join plan.

			regards, tom lane

--
Sent via pgsql-performance mailing list (pgsql-performance@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance
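[For illustration, a sketch of the rewrite being suggested. The table
name "messages", the column "msg_id", and the temp-table name
"wanted_ids" are stand-ins, not names from the original thread.]

```sql
-- Instead of the ANY(ARRAY[...]) form with hundreds of thousands of IDs:
--   SELECT * FROM messages WHERE msg_id = ANY(ARRAY[1, 2, 3, ...]);

-- Option 1: join against a VALUES(...) list.
SELECT m.*
FROM messages m
JOIN (VALUES (1), (2), (3)) AS ids(msg_id)
  ON m.msg_id = ids.msg_id;

-- Option 2: load the IDs into a temporary table and join against it.
-- Running ANALYZE gives the planner real row estimates, which helps it
-- choose a hash join over repeated index probes.
CREATE TEMP TABLE wanted_ids (msg_id bigint);
INSERT INTO wanted_ids (msg_id) VALUES (1), (2), (3);  -- or COPY from a file
ANALYZE wanted_ids;

SELECT m.*
FROM messages m
JOIN wanted_ids w ON m.msg_id = w.msg_id;
```

With a join formulation like this, the planner is free to hash the ID
list and scan the big table once — which is the hash join plan referred
to above, and typically far cheaper than 300,000 individual index
lookups.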