[Posted a similar note to PG General, but I suppose it's more appropriate on this list. Apologies for cross-posting.]

Hi. Further to my bafflement with "count(*)" queries, as described in this thread:

http://archives.postgresql.org/pgsql-general/2009-01/msg00804.php

It seems that whenever this question comes up, PostgreSQL comes up very short on "count(*)". The performance is always slow, apparently because of the planner's need to estimate and such. I don't fully understand how the statistics work (and the explanation on the PG website is far too geeky for me), but the columns I work with already have a statistics target of 100. It isn't helping at all.

We are now considering web-based logging functionality for users of our website, which means the table could be heavily INSERTed into. We get about 10 million hits a day, and I'm guessing we will have to keep this data around for a while.

My question: with that kind of volume and the underlying aggregation functions (by product id, dates, possibly IP addresses or at least countries of origin...), will PG ever be a good choice? Or should I be looking at some other kind of tool? I wonder if OLAP tools would be overkill for something that needs to look like a bare-bones version of Google Analytics limited to our site.

Appreciate any thoughts. If possible, I'd prefer to keep the "just use MySQL" suggestions to a minimum! Thanks!
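
P.S. For concreteness, here is a rough sketch of the kind of schema, statistics setting, and aggregates I have in mind. The table and column names are made up for illustration, not our real schema:

    -- Hypothetical log table (names are placeholders)
    CREATE TABLE hits (
        hit_time   timestamptz NOT NULL,
        product_id integer     NOT NULL,
        ip_addr    inet
    );

    -- The statistics target I mentioned (the real columns are already at 100),
    -- followed by an ANALYZE so the planner picks the new stats up.
    ALTER TABLE hits ALTER COLUMN product_id SET STATISTICS 100;
    ANALYZE hits;

    -- The kind of aggregation the reports would run constantly:
    SELECT product_id,
           date_trunc('day', hit_time) AS day,
           count(*)
    FROM   hits
    WHERE  hit_time >= now() - interval '30 days'
    GROUP  BY product_id, date_trunc('day', hit_time);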