Design question

Hi all,

I've got a design question that I need to ask before I go too far down what 
might be the wrong road.

I've got a customer who, in turn, has multiple customers of their own, and each of 
them needs to be able to upload an Excel spreadsheet into Postgres.  Then they want 
to be able to slice and dice that data.

The problem is that probably none of these spreadsheets will have the same 
fields in them.

There are two ways to do this that I can think of...

1.  Create a table for each spreadsheet, using the column headings as field names.  
Every field would be a char/varchar.  We might have a table to track which 
client owns which table.  This could amount to tens of tables being added to 
the db (see the DDL sketch after this list).

2.  Create a table in which we store individual cells and associate each one with 
an owner (also sketched below).  Then each client would essentially have one 
(huge?) table that they can work with.
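
For concreteness, here's roughly what I have in mind for each approach.  All 
table and column names below are just placeholders; this is a sketch, not DDL 
I'm committed to.

Design #1, one table per uploaded spreadsheet plus a small catalog table:

    -- columns come straight from the spreadsheet headings, all text
    CREATE TABLE client42_sales_2011 (
        "Invoice"  varchar,
        "Customer" varchar,
        "Amount"   varchar
    );

    -- which client owns which generated table
    CREATE TABLE sheet_catalog (
        table_name varchar PRIMARY KEY,
        client_id  integer NOT NULL
    );

Design #2, one generic cell table shared by everything a client uploads:

    -- one row per spreadsheet cell, tagged with its owner
    CREATE TABLE cells (
        client_id  integer NOT NULL,
        sheet_id   integer NOT NULL,
        row_num    integer NOT NULL,
        col_name   varchar NOT NULL,
        cell_value varchar,
        PRIMARY KEY (client_id, sheet_id, row_num, col_name)
    );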

Design #1 is easy to implement, but might make management more difficult.  
Design #2 is easy to manage, but the SQL needed to generate reports would 
be "tricky" (see the example query below).  I intend to provide a report 
generator, so the complexity of the reporting SQL can be mitigated.
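
To illustrate what I mean by "tricky": to get an ordinary row-shaped report back 
out of design #2, every spreadsheet column turns into a self-join (or into a 
crosstab() call from the tablefunc contrib module).  Using the made-up cells 
table sketched above:

    -- rebuild three columns of one sheet as a normal result set
    SELECT a.row_num,
           a.cell_value AS "Invoice",
           b.cell_value AS "Customer",
           c.cell_value AS "Amount"
    FROM   cells a
    JOIN   cells b USING (client_id, sheet_id, row_num)
    JOIN   cells c USING (client_id, sheet_id, row_num)
    WHERE  a.col_name = 'Invoice'
      AND  b.col_name = 'Customer'
      AND  c.col_name = 'Amount'
      AND  a.client_id = 42
      AND  a.sheet_id  = 7;

That grows by one join per column, which is the part the report generator would 
have to hide.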

So, which road should I travel down?

TIA,

-- 
Mike Diehl

