
Re: dynamic schema modeling and performance

I've done the dynamic-table-per-project approach previously and it worked great, even with dynamic indexes on it. In the low thousands of tables it should work ok. Beyond that, use as many static columns as possible, put everything dynamic in jsonb, and check it with per-project constraints.
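
For concreteness, a minimal sketch of that hybrid layout (table, column, and constraint names are all made up):

    -- One table per project: shared fields are real columns, user-defined
    -- fields live in jsonb, and project-specific rules are ordinary CHECK
    -- constraints on that project's table.
    CREATE TABLE project_42_records (
        id         bigserial PRIMARY KEY,
        created_at timestamptz NOT NULL DEFAULT now(),
        attrs      jsonb NOT NULL DEFAULT '{}',
        -- example per-project rule: every row must carry a numeric "ph"
        CONSTRAINT ph_required CHECK (
            (attrs ? 'ph') AND jsonb_typeof(attrs -> 'ph') = 'number'
        )
    );

    -- a dynamic index on one user-defined field
    CREATE INDEX project_42_records_ph_idx
        ON project_42_records ((attrs ->> 'ph'));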

On Wed, Apr 12, 2017 at 3:31 AM, RJ Ewing <ewing.rj@xxxxxxxxx> wrote:
I thought that might be an answer around here :)

I guess I was looking for what might be a better approach. 

Is dynamically creating a table for each entity a bad idea? I can see something like creating a schema for each project (a group of related entities) and then creating a table for each entity within that schema. I don't expect to have more than a few thousand projects anytime soon; we have a relatively targeted audience.
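
If it helps to picture it, a hypothetical sketch of that layout (all names invented), with one schema per project and one table per entity:

    CREATE SCHEMA project_birds;

    -- a partially user-driven "photo" entity: common photo fields plus
    -- columns this project's users added
    CREATE TABLE project_birds.photo (
        id        bigserial PRIMARY KEY,
        file_path text NOT NULL,   -- common to every photo entity
        taken_at  timestamptz,     -- common to every photo entity
        camera    text,            -- user-added column
        iso       integer          -- user-added column
    );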

Or would it be better to use jsonb data types and create a denormalized index elsewhere?
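
A rough sketch of the jsonb route (names hypothetical): one generic table holds every entity's rows, a GIN index serves containment queries inside Postgres, and an external engine would hold the denormalized copy for full-text search:

    CREATE TABLE entity_rows (
        id        bigserial PRIMARY KEY,
        entity_id integer NOT NULL,  -- which user-defined entity the row belongs to
        doc       jsonb NOT NULL     -- the user-defined fields
    );

    CREATE INDEX entity_rows_doc_idx ON entity_rows USING GIN (doc);

    -- e.g. rows of entity 7 whose doc contains {"species": "oak"}
    SELECT *
      FROM entity_rows
     WHERE entity_id = 7
       AND doc @> '{"species": "oak"}';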



On Apr 11, 2017, at 5:17 PM, Dorian Hoxha <dorian.hoxha@xxxxxxxxx> wrote:

If you are asking whether you should go NoSQL: 99% of the time, you should not.

On Tue, Apr 11, 2017 at 10:06 PM, Poul Kristensen <bcc5226@xxxxxxxxx> wrote:
dataverse.org uses PostgreSQL and is well documented, and it is completely user-driven. Maybe the concept could be useful for you. I have installed and configured a few of them for use by researchers.

regards 
Poul   


2017-04-11 19:46 GMT+02:00 Rj Ewing <ewing.rj@xxxxxxxxx>:
I'm looking for thoughts on the best way to handle dynamic schemas.

The application I am developing revolves around user-defined entities. Each entity is a tabular dataset with user-defined columns and data types. Entities can also be related to each other through Parent-Child relationships. Some entities will be 100% user-driven, while others (such as an entity representing a photo) will be partially user-driven (all photo entities will have common fields plus custom user additions).
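
To make that concrete, one hypothetical way to model the entity definitions themselves (all names invented); the open question is where each entity's rows should then live:

    -- catalog of user-defined entities, with optional Parent-Child links
    CREATE TABLE entity (
        id        serial PRIMARY KEY,
        name      text NOT NULL,
        parent_id integer REFERENCES entity (id)  -- Parent-Child relationship
    );

    -- the user-defined columns and data types of each entity
    CREATE TABLE entity_field (
        entity_id integer NOT NULL REFERENCES entity (id),
        name      text NOT NULL,
        data_type text NOT NULL,  -- e.g. 'text', 'integer', 'timestamptz'
        PRIMARY KEY (entity_id, name)
    );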

I was hoping to get opinions on whether PostgreSQL would be a suitable backend. A couple of options I have thought of are:

1. Each entity is represented as a table in Postgres. The schema would be dynamically updated (with limits) whenever an entity mapping is updated; see the DDL sketch after this list. I believe this would provide the best data constraints and allow the best data normalization. A concern I have is that an enormous number of tables could be generated, and the performance impact this might have in the future. I could then run Elasticsearch as a denormalized cache for efficient querying and full-text search.

2. Use a NoSQL database. This provides the "dynamic" schema aspect. A concern here is the lack of relational support, which leads to a more denormalized data structure and the potential for the data to become corrupted.
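
As a sketch of what option 1's dynamic updates could look like (entity and column names hypothetical), the application would issue DDL whenever a user edits an entity mapping:

    -- user adds a "wing_span" field to entity 1234
    ALTER TABLE entity_1234 ADD COLUMN wing_span numeric;

    -- user later marks the field as required
    ALTER TABLE entity_1234 ALTER COLUMN wing_span SET NOT NULL;

    -- user removes the field again
    ALTER TABLE entity_1234 DROP COLUMN wing_span;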

Any opinions on the use of Postgres for this case, or on other options, would be greatly appreciated!

RJ



--
Med venlig hilsen / Best regards
Poul Kristensen
Linux-OS/Virtualization expert and Oracle DBA


