
Strategies/Best Practices for Handling Large Tables

Hi,

I currently have a table that is growing very quickly, i.e., 7 million records in 5 days. This table holds statistics, so records are only ever inserted, never updated or deleted.

Many queries are run against this table for trend analysis. However, these queries are now taking a very long time (hours) to execute due to the size of the table.

I have put indexes on this table, to no significant benefit. Some of the other strategies I have thought of (sketched below):
1. Purge old data
2. Reindex
3. Partition the table
4. Create daily, monthly, and yearly summary tables that contain aggregated data specific to the statistics required
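
For example, strategies 3 and 4 might look roughly like the following. This is only a sketch, assuming a hypothetical stats table keyed by a recorded_at timestamp; the table and column names are invented for illustration, and declarative partitioning requires PostgreSQL 10 or later (older versions would use table inheritance instead):

-- Strategy 3: range-partition the raw data by month.
CREATE TABLE stats (
    recorded_at timestamptz NOT NULL,
    metric      text        NOT NULL,
    value       bigint      NOT NULL
) PARTITION BY RANGE (recorded_at);

CREATE TABLE stats_2024_01 PARTITION OF stats
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

-- Strategy 1 then becomes cheap: drop or detach the oldest partition
-- instead of running a slow DELETE over millions of rows.
-- DROP TABLE stats_2023_01;

-- Strategy 4: a daily summary table fed by a periodic rollup, so trend
-- queries scan a handful of aggregate rows instead of the raw table.
CREATE TABLE stats_daily (
    day    date   NOT NULL,
    metric text   NOT NULL,
    total  bigint NOT NULL,
    PRIMARY KEY (day, metric)
);

-- Roll up yesterday's raw rows into the summary table (run once a day).
INSERT INTO stats_daily (day, metric, total)
SELECT date_trunc('day', recorded_at)::date, metric, sum(value)
FROM   stats
WHERE  recorded_at >= date_trunc('day', now() - interval '1 day')
AND    recorded_at <  date_trunc('day', now())
GROUP  BY 1, 2;

The same rollup pattern would extend to monthly and yearly tables, with trend queries reading stats_daily rather than the raw table.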

Does anyone know the best practice for handling this situation?

I would appreciate any knowledge sharing on the pros and cons of the above, or on any other strategies I could put in place.

Thanking you in advance.

