When should I worry?

I've started a database that's doing wonderfully, and I'm watching the tables grow at a steady clip.

Performance is great, the indexes are nice, and SQL costs are low. As far as I can tell, I've done a respectable job of setting up the database, tables, sequences, indexes...


But a little math tells me that one table in particular is going to get ugly.

Right now I add about 70,000 rows per user per day, and that's with a total of only 6 users.

If the user base gets to 100 or more, I'll be hitting a billion rows before too long: at 100 users that's 7 million rows per day, so I'll hit a billion in about 142 days. Call it six months for simplicity.
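
Just to show the arithmetic (the 100-user and 70,000-rows-per-user figures are the ones above), here's the back-of-the-envelope check run as a query:

    -- ~1 billion rows / (100 users * ~70,000 rows per user per day)
    SELECT 1000000000 / (100 * 70000) AS days_to_a_billion;  -- 142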


The table itself is narrow (two columns: a bigint and an int), but I'm wondering when I'll start to hit a knee in performance and how I can monitor for that. I know that where I work (day job) they have Oracle tables with a billion rows that just plain suck. I don't know if a billion rows is inherently bad or if the DBAs were not given the right opportunity to make their tables work.

But if they are any indication, I'll be feeling some hurt when I exceed a billion rows. Am I going to just fold up and die in six months?

I can't really expect anyone to have an answer regarding hardware, table size, performance ... but is there some way I can either monitor for this or estimate it before it happens?
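
For what it's worth, this is the sort of thing I imagine watching; the table name "samples" is just a placeholder, and I'm assuming the size functions plus the reltuples estimate in pg_class are reasonable things to track over time:

    -- Rough size and row-count check; 'samples' is a made-up table name.
    SELECT pg_size_pretty(pg_relation_size('samples'))       AS heap_size,
           pg_size_pretty(pg_total_relation_size('samples')) AS size_with_indexes,
           (SELECT reltuples::bigint
              FROM pg_class
             WHERE relname = 'samples')                      AS estimated_rows;

Logging those numbers once a day, alongside the timing of a representative query, seems like it would at least show where the knee starts, if there is one.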

