
Re: Very large tables

On Fri, Nov 28, 2008 at 3:48 PM, Alvaro Herrera <alvherre@xxxxxxxxxxxxxxxxx> wrote:
William Temperley wrote:

> I've been asked to store a grid of 1.5 million geographical locations,
> fine. However, associated with each point are 288 months, and
> associated with each month are 500 float values (a distribution
> curve), i.e. 1,500,000 * 288 * 500 = 216 billion values :).
>
> So a 216 billion row table is probably out of the question. I was
> considering storing the 500 floats as bytea.

What about a float array, float[]?
You seriously don't want to use bytea to store anything, especially when a matching datatype exists in your database of choice.
Also, consider partitioning it :)
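
Roughly something like this (table and column names are just placeholders, and a yearly split is only one way to cut it) -- one row per location per month, the 500-value curve packed into a float8[], and children partitioned by year through inheritance plus CHECK constraints:

    CREATE TABLE distribution (
        location_id  integer   NOT NULL,
        month        date      NOT NULL,    -- first day of the month
        curve        float8[]  NOT NULL     -- the 500 distribution values
    );

    -- one child table per year of data, e.g.:
    CREATE TABLE distribution_1990 (
        CHECK (month >= '1990-01-01' AND month < '1991-01-01')
    ) INHERITS (distribution);

    CREATE INDEX distribution_1990_idx
        ON distribution_1990 (location_id, month);

That way you end up with 1,500,000 * 288 = 432 million rows instead of 216 billion, and queries that filter on month can skip whole years via constraint exclusion.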

Try to follow the rules of normalization; with that sort of data, the less storage space used, the better :)
And I would look for a machine with rather fast RAID storage :) (and a spacious one, too).
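
Back-of-the-envelope: 216 billion float8 values * 8 bytes is about 1.7 TB of raw float data alone (roughly 4 kB of array per row), before row headers, indexes and any TOAST compression -- so "spacious" really does mean spacious here.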



--
GJ
