
Re: How To: A large [2D] matrix, 100,000+ rows/columns

On 6/8/23 22:17, Pat Trainor wrote:
> Imagine something akin to stocks, where you have a row for every stock
> and a column for every stock. A number sits at each (row, column)
> position, except where the same stock is both the row and the column.
> I need to maintain and query a very large matrix, and if a single table
> can't hold it (PostgreSQL's 1,600-column limit), how could such data be
> broken down to work?

 100,000 rows *
 100,000 columns *
 8 bytes (assuming float8)
= about 80 GB per matrix if I got the math correct.
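(Sanity check, same arithmetic in Python:

    rows = cols = 100_000
    cell_bytes = 8                    # one float8 per cell
    print(rows * cols * cell_bytes)   # 80,000,000,000 bytes = 80 GB

so 80 GB decimal, about 74.5 GiB, before any storage overhead.)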

Is this really a dense matrix, or is it sparse? What kinds of operations do you need to run against it?
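If it is sparse, the usual trick is to store only the populated cells as (row, col, value) triples and rebuild a matrix object when you need one. A rough sketch, untested, using a hypothetical matrix_cells table and scipy:

    # assumes a hypothetical table: matrix_cells(row_id int, col_id int, value float8)
    from scipy.sparse import coo_matrix

    # triples as they might come back from
    #   SELECT row_id, col_id, value FROM matrix_cells
    triples = [(0, 1, 3.5), (7, 99_999, -1.0), (42, 42, 7.25)]
    r, c, v = zip(*triples)
    m = coo_matrix((v, (r, c)), shape=(100_000, 100_000))
    print(m.nnz)   # 3 stored entries instead of 10 billion cells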

Does it really need to be stored as such, or could it be stored as vectors that are converted to a matrix on the fly when needed?
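One way to do that, sketched and untested (the matrix_rows table and connection string are made up): store one float8[] per stock, which keeps the table to 100,000 rows, sidesteps the 1,600-column limit, and keeps each array around 800 kB:

    # assumes a hypothetical table: matrix_rows(row_id int, vals float8[])
    import numpy as np
    import psycopg2

    conn = psycopg2.connect("dbname=stocks")   # hypothetical DSN
    cur = conn.cursor()
    # in practice fetch a subset: the full 100,000 x 100,000 matrix
    # would need roughly 80 GB of client-side RAM
    cur.execute("SELECT vals FROM matrix_rows ORDER BY row_id LIMIT 1000")
    # psycopg2 returns float8[] as a Python list, so the rows stack directly
    matrix = np.array([vals for (vals,) in cur.fetchall()], dtype=np.float64)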

Seems like using Python or R makes more sense. Perhaps it might make sense to store the data in Postgres and use PL/Python or PL/R. But it is hard to say without more details.
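To give a flavor of the PL/Python route (again untested, reusing the hypothetical matrix_rows layout above), the computation can run next to the data, e.g. a dot product of two stored rows:

    CREATE EXTENSION IF NOT EXISTS plpython3u;

    CREATE FUNCTION row_dot(a int, b int) RETURNS float8
    LANGUAGE plpython3u AS $$
        # arguments are visible by name in the function body;
        # plpy is PL/Python's built-in database access module
        ra = plpy.execute("SELECT vals FROM matrix_rows WHERE row_id = %d" % a)[0]["vals"]
        rb = plpy.execute("SELECT vals FROM matrix_rows WHERE row_id = %d" % b)[0]["vals"]
        return sum(x * y for x, y in zip(ra, rb))
    $$;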


--
Joe Conway
PostgreSQL Contributors Team
RDS Open Source Databases
Amazon Web Services: https://aws.amazon.com





