Re: storing binary files / memory limit


On Sat, 2008-10-11 at 18:26 +0200, Tomas Vondra wrote:
> Is there any other way to solve storing of large files in PostgreSQL? 

No, not until there are functions that let you fopen() a bytea column.

Also, your "... || more_column" approach will generate large numbers of
dead rows and require frequent vacuuming: under PostgreSQL's MVCC, every
UPDATE writes a complete new copy of the row, leaving the old version
behind as a dead tuple.
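To illustrate the problem, a chunk-append pattern might look like this
(the table and column names here are hypothetical, not from your schema):

```sql
-- Hypothetical append pattern: each UPDATE rewrites the entire
-- row, so every appended chunk leaves a full-size dead copy of
-- the previous row behind for VACUUM to reclaim.
UPDATE files
   SET content = content || :next_chunk
 WHERE id = :file_id;
```

Appending N chunks this way writes roughly N full copies of the
growing value, which is why the dead-row volume gets large quickly.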

> - Optimization is a serious criterion, as is reliability.

If you're using tables with very large columns, make sure you index
every other column you're going to access the table by.  If PostgreSQL
has to resort to full-table scans on this table, and especially with a
low memory constraint, you could easily end up with it doing an on-disk
sort on a copy of the data.
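Concretely, that just means something like this (index and column names
are hypothetical placeholders):

```sql
-- Hypothetical: index the lookup column so queries can find
-- rows without scanning (and potentially sorting) the table
-- that carries the large bytea values.
CREATE INDEX files_filename_idx ON files (filename);
```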

If you *have* to store it in a table column (and it really isn't the
most efficient way of doing it) then create a separate table for it
which is just SERIAL + data.
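As a sketch of that layout (all names hypothetical), keep the metadata
in one table and the raw bytes in a narrow SERIAL + data table:

```sql
-- Hypothetical sketch of the suggested split: a narrow
-- SERIAL + data table holds the bytes, and the metadata
-- table references it by id, so queries on metadata never
-- touch the wide rows.
CREATE TABLE file_data (
    id   SERIAL PRIMARY KEY,
    data BYTEA NOT NULL
);

CREATE TABLE file_meta (
    id       SERIAL PRIMARY KEY,
    filename TEXT NOT NULL,
    data_id  INTEGER NOT NULL REFERENCES file_data (id)
);
```

This way scans and sorts over the metadata stay small, and the large
values are only fetched when you ask for them explicitly.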

					Andrew McMillan.
Andrew @ McMillan .Net .NZ                         Porirua, New Zealand                    Phone: +64(272)DEBIAN
   It is often easier to tame a wild idea than to breathe life into a
                        dull one. -- Alex Osborn

