Re: Best Strategy for Large Number of Images

> I don't currently use PostgreSQL, but I plan to migrate and I have a question
> about the best way/strategy for storing images. I have about 2 million images,
> with a growth trend of around 1 million images per year. I plan to store them
> in bytea format in an isolated table. Is this recommended? Is there a better way?

Do you need to edit those images? Or do you need to make sure they are not edited, and to know who did it if they are?

If it's the latter, I'd say make a versioned external repo (a git repo will do) and have your application store the chain of SHA-1s of the file commits. If anyone wants to alter your images behind your back, they'll need to gain access to both the DB and the repo, and to know how you pair the two sets of information (how deeply paranoid you need to be is up to your business requirements).
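
As a rough sketch of what that pairing could look like on the database side: the image bytes live in the git repo, and the table only holds the reference plus the commit and content hashes. Table and column names here are just assumptions for illustration, and the hash values are placeholders your application would fill in at commit time:

    -- Hypothetical audit table: one row per image version committed to the repo.
    CREATE TABLE images_audit (
        image_id     bigserial    PRIMARY KEY,
        repo_path    text         NOT NULL,  -- path of the file inside the git repo
        commit_sha1  char(40)     NOT NULL,  -- SHA-1 of the git commit that added/changed it
        content_sha1 char(40)     NOT NULL,  -- SHA-1 of the file content itself
        inserted_at  timestamptz  NOT NULL DEFAULT now()
    );

    -- The application inserts one row per committed image, e.g.:
    INSERT INTO images_audit (repo_path, commit_sha1, content_sha1)
    VALUES ('2013/01/img_000001.png',
            'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa',   -- example commit hash
            'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb');  -- example content hash

To verify an image later, the application recomputes the file's SHA-1 and checks it against both the table and the repo history.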

If you just want to keep the images and no particular security is required, I'd say just store them in the DB. 1,000,000 pictures/year is roughly 2/minute if the flow is regular. As per your later posts, the images are not so big that they could not be managed, you don't seem to expect a lot of concurrent reads, and by doing it this way all you have to think about is ONE backup/restore procedure for the DB. This might be more complex if you expect traffic peaks during the insert phase, of course.
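
If you go that way, a minimal sketch of the isolated bytea table described in the original post could look like this (names are just assumptions for illustration):

    -- Hypothetical isolated table holding the image bytes in the database.
    -- bytea values over ~2 kB are TOASTed (compressed and stored out of line)
    -- by default, so the main row stays small.
    CREATE TABLE images (
        image_id    bigserial    PRIMARY KEY,
        file_name   text         NOT NULL,
        mime_type   text         NOT NULL,
        data        bytea        NOT NULL,
        inserted_at timestamptz  NOT NULL DEFAULT now()
    );

    -- Inserting from a client library is just a parameterised insert, e.g.:
    -- INSERT INTO images (file_name, mime_type, data) VALUES ($1, $2, $3);

Keeping this table separate from the rest of the schema also makes it easy to move it to its own tablespace later if the volume grows.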

Berto
