Re: Big image tables maintenance

On 09/17/2018 07:38 AM, still Learner wrote:
Hi ,

I have a 10 TB table with multiple bytea columns (images & docs), which makes the DB 20 TB overall. I have a couple of issues maintaining it.

1. I would like to split the image column out of the 10 TB table and place it in a separate schema. The change should not require any query changes in the application. Is that possible? Doing this should not affect performance.

That's called "vertical partitioning", which Postgres doesn't support natively.
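It can be approximated by hand, though: move the wide columns into a side table in another schema and put a view with the original table's name over the join, so application SELECTs keep working unchanged. A minimal sketch, assuming a hypothetical table "docs" keyed by "id" with bytea columns "image_data" and "doc_data" (writes through the join view would additionally need INSTEAD OF triggers; note also that large bytea values are already stored out of line in the table's TOAST relation):

```sql
BEGIN;

CREATE SCHEMA blobs;

-- Keep the narrow columns in the base table; move the blobs aside.
ALTER TABLE docs RENAME TO docs_base;

CREATE TABLE blobs.docs_blobs (
    id         bigint PRIMARY KEY REFERENCES docs_base (id),
    image_data bytea,
    doc_data   bytea
);

INSERT INTO blobs.docs_blobs (id, image_data, doc_data)
    SELECT id, image_data, doc_data FROM docs_base;

ALTER TABLE docs_base
    DROP COLUMN image_data,
    DROP COLUMN doc_data;

-- A view with the old name keeps application reads unchanged.
CREATE VIEW docs AS
    SELECT b.*, l.image_data, l.doc_data
    FROM docs_base b
    LEFT JOIN blobs.docs_blobs l USING (id);

COMMIT;
```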


2. I can't maintain the files on the filesystem, as the count is huge,

Eh?  You aren't supposed to maintain the files on the filesystem; Postgres is.

so I'm thinking of using a NoSQL store, most likely MongoDB. Is that recommended, or can PostgreSQL itself handle it?

3. Taking a backup of 20 TB of data is a big task. Is there a more feasible solution than online backup/pg_dump?

pgbackrest and barman are popular options.

(We have a database like yours, though only 3TB, and have found that pg_dump runs a lot faster with "--compress=0".  The backups are 2.25x larger than the database, though...)
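For logical dumps of this size, the directory format with parallel workers can also help; a sketch using standard pg_dump/pg_restore options, with a placeholder path and database name:

```shell
# Directory-format dump, 8 parallel workers, compression disabled
# ("-Z 0" is equivalent to --compress=0):
pg_dump -F directory -j 8 -Z 0 -f /backups/mydb.dir mydb

# Restore in parallel from the same dump directory:
pg_restore -j 8 -d mydb /backups/mydb.dir
```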


Each image retrieval is 
Currently, we are on pg 9.4 and moving to 10.5 soon.
 
Thanks,
GJ.

--
Angular momentum makes the world go 'round.
