Hi,

SunWuKung wrote:
Hi, I found this to create dynamic crosstabs (where the resulting columns
...
This could work, although for hundreds of columns it looks a bit scary to me.
Well I'd say hundreds of columns are always scary, no matter how you do it :-) ...
I know that most db people don't care much about pivot/crosstab in the db, but imagine this situation: I am storing questionnaire results on people. Since the questionnaires are created by users, I have no other way than using an EAV model, like personID, questionID, responseValue, to store the responses. Now this table gets long: 300 questions per questionnaire and 3000 people gives us 1M rows. Whenever I need to download this data, in my case two thirds of it would be redundant if I could pivot it first, and in a 20MB CSV that's significant (I know it's a trade-off between processing and storage). Moreover, my users can't do anything with this data format; they need to pivot it offline anyway, which is not easy (Excel can't do it, Access can't do it, numberGo can't do it, for different reasons).
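Just to make it concrete, the response table is basically this (the column names are the ones above, the types here are only indicative, not the real schema):

    CREATE TABLE response (
        personID      integer NOT NULL,   -- who answered
        questionID    integer NOT NULL,   -- which question
        responseValue text,               -- the answer itself (type assumed)
        PRIMARY KEY (personID, questionID)
    );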
What about not pivoting it? You can run your analysis directly against your database.
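For example, something along these lines (just a sketch against the hypothetical table above; the names are made up for illustration):

    -- Per-question summary straight from the EAV rows, no pivot needed.
    SELECT questionID,
           count(*)                 AS answers,
           count(DISTINCT personID) AS respondents
    FROM   response
    GROUP  BY questionID
    ORDER  BY questionID;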
Although the application could do it, I think this is generic functionality that the database is better suited for.
Well, after all you want a CSV, not a table. You could shortcut this with a generic query which builds an array out of your "columns" and joins it into a CSV line. That line would then be output as a single text column from the database.
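Roughly like this (a sketch only; it assumes array_agg, i.e. PostgreSQL 8.4 or later, and ORDER BY inside the aggregate needs 9.0, otherwise aggregate over a pre-sorted subquery):

    -- One CSV line per person: collect the answers into an array,
    -- then glue the elements together with commas.
    SELECT personID,
           array_to_string(
               array_agg(coalesce(responseValue, '') ORDER BY questionID),
               ','
           ) AS csv_line
    FROM   response
    GROUP  BY personID
    ORDER  BY personID;

Note that the columns only line up if every person has a row for every question; if answers can be missing, you would first outer-join against the question list so every cell is present.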
Please let me know if you know of a good db-based way to create a dynamic crosstab in Postgres, or why there shouldn't be one.
See above :-)

Regards
Tino