On May 31, Armand Pirvu modulated:

> The idea is that I would like to avoid having an index for each key
> possibly wanted to search, but rather say have the whole record then
> search by the key and get the key value, thus having one index serving
> multiple purposes so to speak

First, benchmarking is important to figure out whether any proposed
indexing actually speeds up the kinds of queries you want to perform.
With the recently added parallel query features, a simpler indexing
scheme with some brute-force search might be adequate.

But you could use a search idiom like this:

   (lower(json_column::text)::json) ->> lower('key') = 'value'

This down-converts the case on all keys and values. (Note the ->>
operator, which extracts the field as text; plain json values have no
equality operator, so the extracted field is compared as text against a
lowercased literal.) The left-hand parenthesized expression could be
precomputed in an expression index to avoid repeated case conversion.
But typical searches will still have to scan the whole index to perform
the projection and match the final value test on the right-hand side.

If you want to do things like substring matching on field values, you
might stick with text and use regexp matches:

   (lower(json_column::text)) ~ 'valuepattern'

or more structural searches:

   (lower(json_column::text)) ~ '"key": "[^"]*substring[^"]*"'

Here, the left-hand expression could be trigram-indexed to help with
sparse substring matching without a full index scan. We've had good
luck using trigram indexing with regexp matching, though I've honestly
never used it for the purpose sketched above...

Karl

-- 
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
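The trigram-index idea sketched above can be made concrete as follows (a sketch only; it assumes a hypothetical table `docs` with a json column `json_column`, and the `pg_trgm` contrib extension — none of these names come from the original thread):

```sql
-- pg_trgm provides the trigram operator classes used below.
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- Trigram GIN index over the lowercased document text.  This is the
-- precomputed left-hand expression from the regexp idioms above, and
-- it lets sparse substring/regexp searches avoid a full table scan.
CREATE INDEX docs_json_trgm_idx
    ON docs USING gin ((lower(json_column::text)) gin_trgm_ops);

-- A structural search that can use the trigram index:
SELECT *
  FROM docs
 WHERE lower(json_column::text) ~ '"key": "[^"]*substring[^"]*"';
```

One caveat: a plain btree expression index on `lower(json_column::text)::json` will not work, because the `json` type has no default btree operator class; indexing the text form, as above, sidesteps that.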