We have a couple of tables where we store very large JSON objects in a JSONB column. We don't use these in real-time situations or for intensive queries, so I'm not particularly worried about their performance; they mostly just hold vendor data that gets parsed into smaller, more usable tables as it comes in. They serve as the historical record, and occasionally we re-process them.
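For reference, the tables themselves are minimal, basically an ID, a bit of metadata, and the big JSONB blob. Simplified, with placeholder names (not our actual schema):

```sql
-- Simplified sketch of one of these tables (names are placeholders):
-- an ID, some metadata, and the large vendor payload.
CREATE TABLE vendor_payloads (
    id          bigserial   PRIMARY KEY,
    vendor_id   integer     NOT NULL,
    received_at timestamptz NOT NULL DEFAULT now(),
    payload     jsonb       NOT NULL  -- the very large vendor document
);
```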
I'm wondering if there's a smarter solution in Postgres. As I understand it, these values end up in TOAST tables, so the main table stays quite small while the associated TOAST table just grows and grows (the one in question is currently around 500GB).
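To put numbers on that split, a query along these lines shows the heap size vs. the TOAST size (table name is a placeholder again):

```sql
-- Compare the main heap size to the TOAST table size for one table.
SELECT
    c.relname                                          AS table_name,
    pg_size_pretty(pg_relation_size(c.oid))            AS heap_size,
    pg_size_pretty(pg_relation_size(c.reltoastrelid))  AS toast_size,
    pg_size_pretty(pg_total_relation_size(c.oid))      AS total_size  -- heap + TOAST + indexes
FROM pg_class c
WHERE c.relname = 'vendor_payloads'
  AND c.relkind = 'r';
```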
I don't really care, I guess, that the TOAST tables are that large, but I thought I'd do my due diligence and ask whether there's a different solution, whether this is fine, or whether it's some kind of ticking time bomb I'm not understanding properly.
We like having these in the database because we have SQL scripts to parse them, so saving them as files in S3 and rewriting the ETL in Python or whatever sounds awful.
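For context, those scripts are roughly in this shape, which is why keeping everything in Postgres is so convenient (table and field names here are purely illustrative):

```sql
-- Roughly the kind of re-processing we do: explode a large JSONB payload
-- into rows of a smaller, typed table. Names and fields are placeholders.
INSERT INTO line_items (vendor_id, sku, quantity, unit_price)
SELECT
    p.vendor_id,
    item->>'sku',
    (item->>'quantity')::integer,
    (item->>'unit_price')::numeric
FROM vendor_payloads p
CROSS JOIN LATERAL jsonb_array_elements(p.payload->'items') AS item;
```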