So this seems to be because the result size is too big. I still don't know why it is looping through every record and printing a warning, but adding a LIMIT makes the queries complete in a reasonable time (although not all that fast). However, I also need to sort, and I have many other facets that may or may not be included in a given query. Adding a sort makes it load every record again and take forever.

I tried to create an index covering all of the fields I query on to see if that would help, but I get an error that the index row is too large:

=> create index master_index on source_listings(geo_lat, geo_lon, price, bedrooms, region, city, listing_type, to_tsvector('english', full_listing), post_time);
NOTICE:  word is too long to be indexed
DETAIL:  Words longer than 2047 characters are ignored.
NOTICE:  word is too long to be indexed
DETAIL:  Words longer than 2047 characters are ignored.
NOTICE:  word is too long to be indexed
DETAIL:  Words longer than 2047 characters are ignored.
NOTICE:  word is too long to be indexed
DETAIL:  Words longer than 2047 characters are ignored.
ERROR:  index row requires 13356 bytes, maximum size is 8191

Any ideas about how to resolve this?
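For reference, a minimal sketch of the kind of faceted query described above, assuming the column names from the index definition (the search term and facet values are placeholders only; real queries mix and match the facets):

=> SELECT *
   FROM source_listings
   WHERE region = 'some_region'          -- facets vary per query
     AND price BETWEEN 100000 AND 200000
     AND to_tsvector('english', full_listing)
         @@ plainto_tsquery('english', 'search term')
   ORDER BY post_time DESC               -- the sort that forces a scan of every record
   LIMIT 100;                            -- the LIMIT that keeps runtime tolerable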