On 01/03/2018 15:16, Michael Loftis wrote:
I have created a table with 301 columns (ID, 150 BIGINT, 150 TEXT). The table
contains data for multiple forms, differentiated by ID range, so a single
column holds data from more than one form. To achieve a Unique Constraint and
Indexing per form, I chose PostgreSQL Partial Indexes, which suit my
requirement. I have created Partial Indexes with the ID Range as criteria, and
they provide Uniqueness and Indexing on a per-form basis as expected. But DML
operations on a particular form scan all the Indexes created for the entire
table instead of only the Indexes created for that particular form's ID Range.
This degrades Planner Performance and Query Time by more than 10 times, as
below:
Query Result for the table with 3000 Partial Indexes (15 Indexes per form):
This smells like you've failed to normalize
your data correctly. 3k indexes to ensure uniqueness? It
sounds a lot more like you need 15 tables for 15 forms ...
... or a column that specifies, e.g., the form ID. If the forms don't all have
the same number of BIGINT and TEXT fields, keep the maximum number of columns
and fill only the ones each form needs.
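Something like this (the table and column names are just placeholders, not
your actual schema) would then need only one unique index per field instead of
one partial index per form:

CREATE TABLE form_data(
    id      bigserial PRIMARY KEY,
    form_id integer NOT NULL,   -- which form this row belongs to
    field1  bigint,
    field2  text
    -- ... up to the maximum number of fields any form uses
);

-- per-form uniqueness on field1 with a single composite index,
-- instead of one partial index per form
CREATE UNIQUE INDEX form_data_form_field1_uniq
    ON form_data(form_id, field1);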
You can also use the EAV schema, where the Entity is the form, the
Attribute is the field, and the Value... is the value.
CREATE TABLE tbl(
    id           bigint,    -- the form submission / row identifier
    entity       integer,   -- the form
    attribute    integer,   -- the field (or a string, as you need)
    value_int    bigint,    -- used when the field is numeric
    value_string text       -- used when the field is text
);
This way you'll get many more rows, but very thin ones, and with no more than
3 or 4 indexes (depending on the queries you need to perform) you can
retrieve values quickly.
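For example (the index names and the values in the sample query are only
illustrative), per-form uniqueness and the typical lookups could be covered
by something like:

-- per-form, per-field uniqueness, replacing the per-form partial indexes
CREATE UNIQUE INDEX tbl_uniq_int    ON tbl(entity, attribute, value_int);
CREATE UNIQUE INDEX tbl_uniq_string ON tbl(entity, attribute, value_string);

-- typical lookup: all the values stored for one field of one form
SELECT id, value_int, value_string
FROM tbl
WHERE entity = 7 AND attribute = 3;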
My 2 cents
Moreno.-