On Thu, Mar 1, 2018 at 03:10 Meenatchi Sandanam <meen.opm@xxxxxxxxx> wrote:
I have created a table with 301 columns (ID, 150 BIGINT, 150 TEXT). The table contains data for multiple forms, differentiated by ID range, so a single column holds data from more than one form. To get a unique constraint and indexing per form, I chose PostgreSQL partial indexes, which suit my requirement: I created partial indexes with an ID range as the predicate, and they provide uniqueness and indexing on a per-form basis as expected.

However, DML operations on a particular form scan all of the indexes created for the entire table instead of only the indexes created for that form's ID range. This degrades planner performance, and query time is more than 10 times worse, as shown below:
Query result for the table with 3000 partial indexes (15 indexes per form):
This smells like you’ve failed to normalize your data correctly. 3k indexes to ensure uniqueness? It sounds a lot more like you need 15 tables for 15 forms... perhaps with a view for reading, or maybe 1/15th of the columns to begin with by adding a form_type column... or perhaps an expression index for the unique constraint, such that the output of the function is the normalized portion of the data that’s required to be unique.
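A minimal sketch of the form_type idea (all table and column names here are hypothetical, not taken from the original post):

```sql
-- Hypothetical sketch: one table with a form_type discriminator instead of
-- encoding the form in an ID range.  A single composite unique index then
-- covers every form, replacing thousands of per-range partial indexes.
CREATE TABLE form_data (
    id        bigserial PRIMARY KEY,
    form_type integer NOT NULL,   -- which form this row belongs to
    field_1   bigint,
    field_2   text
    -- ... remaining columns ...
);

-- Per-form uniqueness falls out of one composite index:
CREATE UNIQUE INDEX form_data_uniq
    ON form_data (form_type, field_1, field_2);
```

Queries and DML then filter on form_type, and the planner has only a handful of indexes to consider instead of 3000.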
If you’ve really got 3k different uniqueness criteria differing by ID ranges, then it sounds like an expression index with a function spitting out the hash of the unique fields, but that’d still be hairy; at least you wouldn’t eat the time on every read, though. But I’d reduce that ID-range-based problem to include a unique_type indicator column instead.
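If the ID-range layout has to stay, the expression-index idea might look roughly like this (the function name, the range arithmetic, and the column choices are illustrative assumptions only):

```sql
-- Hypothetical sketch: an IMMUTABLE function mapping a row's ID to its form,
-- so one expression index can enforce per-form uniqueness.  This assumes
-- fixed-width ID ranges of 100000 per form, which may not match your schema.
CREATE FUNCTION form_of(id bigint) RETURNS integer
    IMMUTABLE LANGUAGE sql
    AS $$ SELECT (id / 100000)::integer $$;

-- Uniqueness per (form, key columns) via a single expression index:
CREATE UNIQUE INDEX form_uniq
    ON big_table (form_of(id), field_1, field_2);
```

The function must be IMMUTABLE for PostgreSQL to accept it in an index expression, and a single index like this avoids having the planner weigh thousands of partial-index predicates on every statement.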
"Genius might be described as a supreme capacity for getting its possessors
into trouble of all kinds."
-- Samuel Butler