Hi all,
I'm currently trying to implement a data structure for tracking my website's traffic.
I'm using a key/value table that contains, for example, HTTP headers and/or GET/POST data.
A request table (one request per row) references the key/value pairs using a bigint[] array.
The problem seems to be that queries on the array are very slow.
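For reference, the current layout looks roughly like this (a simplified sketch; table and column names here are placeholders, not my real schema):

-- key/value metadata table: one row per distinct header or GET/POST pair
CREATE TABLE metadata (
    id    bigserial PRIMARY KEY,
    key   text NOT NULL,
    value text NOT NULL,
    UNIQUE (key, value)
);

-- one row per request; metadata_ids holds the ids of its key/value pairs
-- (no real foreign key is possible on the array elements)
CREATE TABLE request (
    id           bigserial PRIMARY KEY,
    requested_at timestamptz NOT NULL DEFAULT now(),
    metadata_ids bigint[] NOT NULL
);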
What I'm trying to achieve:
Store every request and its metadata in the database in order to generate statistics.
Since one request can have one to many key/value pairs, I'm trying to "artificially" broaden the table by using an array to reference the request's metadata.
Advantage:
The metadata table grows slowly, and queries on this table are fast.
Disadvantage:
The request table grows very fast, and statistics over time periods are very slow (e.g. how many users used a proxy yesterday, ...).
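As a concrete example of the kind of query that is slow, it is something along these lines (simplified; detecting a proxy via the X-Forwarded-For header is only an illustration of the pattern):

-- count yesterday's requests that carry a proxy-related header
SELECT count(DISTINCT r.id)
FROM request r
JOIN metadata m ON m.id = ANY (r.metadata_ids)
WHERE m.key = 'X-Forwarded-For'
  AND r.requested_at >= date_trunc('day', now()) - interval '1 day'
  AND r.requested_at <  date_trunc('day', now());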
Q1: Is there an easy way to create an index on the array?
Q2: If yes, how? Are there any special query functions or operators for using such an index?
Q3: Does anybody know an alternative way to store such data without arrays?
Q4: What would be a fast (in terms of performance) alternative (without arrays)?
Q5: What would be a space-saving (i.e. slow-growing table) alternative (without arrays)?
Thanks, all.