Re: Storing large JSON objects in JSONB

On Sat, 2022-07-09 at 12:37 -0700, Wells Oliver wrote:
> We have a couple of tables where we like to store very large JSON objects
> in a JSONB column. We don't use these in real-time situations or intensive
> queries, so I'm not concerned about their performance; it's more just to
> store vendor data that's parsed as we go into smaller, more usable tables.
> They serve as the historical record, and occasionally we re-process them.
> 
> I'm wondering if there's some smarter solution with Postgres. As I
> understand it, these values usually end up in TOAST tables, so the actual
> table is quite small and there's an associated TOAST table that grows and
> grows (the current one is 500GB).
> 
> I don't really care, I guess, that the TOAST tables are that large, but
> I thought I'd do due diligence here and ask if there's some different
> solution, or if this is fine, or if it's some kind of ticking time bomb
> I'm understanding poorly.
> 
> We like these in the database as we have SQL scripts to parse them, so saving
> them as files in S3 and rewriting the ETL in Python or whatever kind of
> sounds awful.

I know of no problem this could cause, other than that if your database grows
very large, it may be a pain to back up.
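
As a side note, if you want to see how much of that space actually lives in
the TOAST relation, the standard size functions can break it down. A minimal
sketch (the table name "vendor_raw" is a placeholder for your own table):

```sql
-- Total on-disk size of the table, including TOAST and indexes
SELECT pg_size_pretty(pg_total_relation_size('vendor_raw'));

-- Size of just the associated TOAST table
-- (reltoastrelid is 0 if the table has no TOAST relation)
SELECT pg_size_pretty(pg_relation_size(c.reltoastrelid))
FROM pg_class AS c
WHERE c.relname = 'vendor_raw'
  AND c.reltoastrelid <> 0;
```

pg_total_relation_size() includes TOAST and indexes, while
pg_relation_size() on reltoastrelid isolates the TOAST data itself.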

Yours,
Laurenz Albe
-- 
Cybertec | https://www.cybertec-postgresql.com
