2016-06-07 15:03 GMT+02:00 Josh Berkus <josh@xxxxxxxxxxxx>:
> On 06/07/2016 08:42 AM, Nicolas Paris wrote:
> You have to do something different. Using multiple columns and/or
> multiple rows might be workable.
Getting a single document from multiple rows coming from PostgreSQL is not that easy... The external tool either treats each PostgreSQL JSON field as a string or has to parse it again. Parsing would add overhead on the external side, so I'd say it is better to build the entire JSON in the external tool. That means not using the PostgreSQL JSON builders at all and delegating the job to a tool that can deal with documents larger than 1 GB.
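For illustration, a minimal sketch of that delegation in Python (assuming psycopg2, a hypothetical "docs" table, and placeholder connection details): per-row JSON fragments are streamed with a server-side cursor and concatenated straight into one large array on disk, so no single PostgreSQL field ever approaches the 1 GB cap and the external side never re-parses the fragments.

    import psycopg2

    # Hypothetical connection string and table name; adjust to your schema.
    conn = psycopg2.connect("dbname=mydb")

    # A named cursor is server-side: rows are streamed in batches
    # instead of the whole result set being buffered in memory.
    cur = conn.cursor(name="json_stream")
    cur.itersize = 1000
    cur.execute("SELECT row_to_json(d)::text FROM docs d ORDER BY d.id")

    with open("big_document.json", "w") as out:
        out.write("[")
        first = True
        for (fragment,) in cur:
            if not first:
                out.write(",")
            out.write(fragment)  # already valid JSON; written as-is, no re-parse
            first = False
        out.write("]")

    cur.close()
    conn.close()

Each row stays well under the field limit, while the file on disk can grow past 1 GB.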
>> Certainly. Kind of disappointing, because I won't find any JSON
>> builder as performant as PostgreSQL's.
> That's nice to hear.
>> Will this 1 GB restriction be increased in the near future?
> Not planned, no. Thing is, that's the limit for a field in general, not
> just JSON; changing it would be a fairly large patch. It's desirable,
> but AFAIK nobody is working on it.
Compared to MongoDB's 16 MB document limit (http://tech.tulentsev.com/2014/02/limitations-of-mongodb/), 1 GB is great. But for my use case it is not sufficient.
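As a reference point, one can measure how close stored values are to that cap server-side; a sketch, assuming the same hypothetical "docs" table with a jsonb column "doc" (octet_length and pg_column_size are standard PostgreSQL functions):

    import psycopg2

    conn = psycopg2.connect("dbname=mydb")  # hypothetical connection string
    cur = conn.cursor()

    # octet_length measures the text representation; pg_column_size
    # measures the on-disk (possibly TOAST-compressed) size.
    cur.execute("""
        SELECT octet_length(doc::text) AS text_bytes,
               pg_column_size(doc)     AS stored_bytes
        FROM docs
        ORDER BY text_bytes DESC
        LIMIT 5
    """)
    for text_bytes, stored_bytes in cur:
        print(text_bytes, stored_bytes)

    cur.close()
    conn.close()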
--
Josh Berkus
Red Hat OSAS
(any opinions are my own)