On Sat, Sep 1, 2012 at 5:42 AM, Edson Richter <edsonrichter@xxxxxxxxxxx> wrote:
It's an interesting thing.
We have a product that runs over PostgreSQL without any problems (well, we have a few, but most of them can be worked around).
Nevertheless, when we present our product to customers, they aren't satisfied until we guarantee that we can run the same product on the major paid databases (Oracle, MS SQL, and so on).
We assure them that PostgreSQL works as well as any other (paid) database, and often better. After that (knowing that they have a choice), they don't question it any more, and they use PostgreSQL without any concerns.
It seems that people (managers) who don't understand the technical details need to know that they have a fallback to a paid version (one they can blame if something goes wrong).
Thankfully, our product running over PostgreSQL has never stopped in five years of development at any of our customers. I cannot say the same about MS SQL Server and MySQL, which have had several problems with database structure, or about DB2, which needs constant DBA maintenance for performance because the application grows so fast.
I have been thinking about this phenomenon a lot. I don't run into it as much as others do, probably because what I think is already out there and so people don't ask, but the question is why this comes up so much. Here is my theory; it is worth bringing up here because it has a bearing on the original question.
The database market has traditionally been dominated by big-cost alternatives, which tend to require substantial investments in per-server and per-user licensing (usually both) and in expertise. For this reason businesses have reasonably chosen to centralize all their systems on a single platform, whether it is Oracle, MS SQL, DB2, Informix, etc. This saves costs and reduces complexity in the IT environment. It seems like a winning strategy.
In actuality, however, the main thing this does is separate commercial, off-the-shelf apps from internal and specialized apps. The former want to reach a larger market, and the only way they can do that is to program in a way that is portable across databases, meaning that everything gets done in standard SQL and advanced features are ignored. Internal apps, and those in markets where they can limit themselves to one database, tend to use advanced features. The developers of commercial apps, however, all try to control access to the database, because that is where their gold is, so most of these databases are (the developer hopes) accessed only by the licensed app. I know of many applications whose EULAs forbid third-party apps from accessing the application's database.
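To make that concrete, here is a small, made-up illustration (the orders table and its columns are hypothetical, not from any particular product) of the gap between portable SQL and a PostgreSQL-specific shortcut such as DISTINCT ON. A vendor targeting every RDBMS writes something like the first query; an app that can assume PostgreSQL can write the second:

    -- Portable, lowest-common-denominator SQL: latest order per customer
    SELECT o.*
      FROM orders o
      JOIN (SELECT customer_id, MAX(ordered_at) AS ordered_at
              FROM orders
             GROUP BY customer_id) latest
        ON o.customer_id = latest.customer_id
       AND o.ordered_at = latest.ordered_at;

    -- The same result using PostgreSQL's DISTINCT ON extension: shorter, but not portable
    SELECT DISTINCT ON (customer_id) *
      FROM orders
     ORDER BY customer_id, ordered_at DESC;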
Where MySQL comes in is that, after content management, it became a database *just good enough* to handle this one-application-per-database scenario, and all the things that make it horrible when 30 apps are writing to it become features under the one-app-per-db, portable-SQL model. MySQL's big weakness here is actually its strength when it comes to its business model.
So the difficulty is that unless IT departments are willing to accept multiple RDBMSs in their environment, you will end up with applications coded in a style that is best described as "we'd use NoSQL, but we want some ad hoc reporting."
The thing about PostgreSQL is that it is not, and never will be, the lowest-common-denominator database, any more than Oracle will be. We aren't highly specialized like Vertica or VoltDB. We are an excellent generalist database that can be used for really advanced data modelling, and we are rock solid behavior-wise, at least if you stay away from the undefined fringes.
Best Wishes,
Chris Travers