Re: Dealing with big tables

> My answer may be off topic since you might be looking for a
> Postgres-only solution... but just in case....

  I'd like to stay with SQL.

> What are you trying to achieve exactly? Is there any way you could
> rework your algorithms to avoid selects and use a sequential scan
> (consider your Postgres data as one big file) to retrieve each of the
> rows, analyze/compute them (possibly in a distributed manner), and
> join the results at the end?

  I'm trying to improve performance, that is, to get the answer to the
query I mentioned faster.

  And since the cardinality is high (100,000+ distinct values), I doubt
it would be possible to match the speed of an indexed select with a
reasonable number of sequential-scan nodes.
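
  To illustrate what I mean (the table and column names below are made
up, since the actual query isn't quoted here; "events" and "category_id"
stand in for the real schema):

    -- Hypothetical table with ~100,000 distinct category_id values.
    CREATE INDEX idx_events_category ON events (category_id);

    -- With the index, Postgres can jump straight to one group's rows:
    EXPLAIN ANALYZE SELECT * FROM events WHERE category_id = 12345;

    -- A sequential scan (a Seq Scan node in the plan) would have to
    -- read the whole table no matter which value is asked for.

  With that many distinct values, each one selects only a small fraction
of the table, which is exactly the case where an index lookup beats
scanning everything.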

  Mindaugas
