Tobias Brox wrote:
> [nospam@xxxxxxxxxxxx - Thu at 06:37:12PM -0600]
> > As my dataset has gotten larger I have had to throw more metal at the
> > problem, but I have also had to rethink my table and query design. Just
> > because your data set grows linearly does NOT mean that the performance of
> > your query is guaranteed to grow linearly! A sloppy query that runs OK
> > with 3000 rows in your table may choke horribly when you hit 50000.
> Then some limit is hit ... either the memory cache is exhausted, or the
> planner makes an unlucky change of strategy once you hit 50000 rows.
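
One quick way to tell which of those limits is being hit is to compare
EXPLAIN ANALYZE output at the two table sizes; a rough sketch, using a
hypothetical orders table:

  -- Hypothetical table: run the same query at ~3000 rows and again at ~50000.
  EXPLAIN ANALYZE
  SELECT * FROM orders WHERE customer_id = 42;
  -- If the plan flips (say from an index scan to a sequential scan), the
  -- planner changed strategy; if a sort or hash reports spilling to disk,
  -- work_mem is the limit; if the plan is unchanged and simply slower,
  -- look at the query itself.
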
Not really. A bad query is a bad query (e.g. a missing join condition). It
won't show up with 3000 rows, but it will very quickly if you increase that
by a reasonable amount. Even something as simple as a missing index on a
join column won't show up on a small dataset but will on a larger one (both
mistakes are sketched below).
It's a pretty common mistake to assume that a small dataset will behave
exactly the same as a larger one - not always the case.
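
To make those two mistakes concrete (table and column names are made up
for illustration):

  -- 1. Missing join condition: the cartesian product is barely noticeable
  --    at 3000 rows but explodes at 50000.
  SELECT c.name, o.total
  FROM customers c, orders o
  WHERE o.status = 'open';      -- forgot: AND o.customer_id = c.id

  -- 2. Missing index on the join column: joins stay cheap on a small table
  --    but degrade badly as it grows.
  CREATE INDEX orders_customer_id_idx ON orders (customer_id);
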
--
Postgresql & php tutorials
http://www.designmagick.com/