Hi All,
I'm curious if there are recommendations for scaling postgres to what,
for me, seems like "a lot" of data...
The app in question currently writes around 1.5 billion rows into a
table before rolling them up into summary tables that hold a few million
roll-up rows each. That 1.5-billion-row table is emptied and refilled
each day, so we're talking about a very heavy write load as well as a
very heavy read load.
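To make that concrete, the daily cycle looks roughly like the sketch
below. The table and column names are invented here just to illustrate
the shape of the process, not the real schema:

  -- staging table that receives the ~1.5 billion raw rows during the day
  CREATE TABLE raw_events (
      event_time  timestamptz NOT NULL,
      device_id   bigint      NOT NULL,
      value       double precision
  );

  -- summary table that ends up holding a few million roll-up rows
  CREATE TABLE daily_summary (
      day         date        NOT NULL,
      device_id   bigint      NOT NULL,
      n_events    bigint      NOT NULL,
      avg_value   double precision
  );

  -- nightly roll-up of the raw rows into the summary table
  INSERT INTO daily_summary (day, device_id, n_events, avg_value)
  SELECT event_time::date, device_id, count(*), avg(value)
  FROM raw_events
  GROUP BY 1, 2;

  -- empty the big table for the next day's load
  -- (TRUNCATE rather than DELETE, to avoid the vacuum overhead)
  TRUNCATE raw_events;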
Where can I find good examples/docs of how to scale postgres for this
kind of data load? What sort of hardware would I be looking to spec?
Okay, now this app may well eventually need to keep those 1.5 billion
rows per day rather than discarding them after the roll-up. Is that
feasible with postgres? If not, what storage and processing solutions
would people recommend for that kind of data load?
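If it is feasible, what I have in mind is something like one child
table per day, so each day's ~1.5 billion rows land in their own
partition and old days can be retired cheaply. A rough sketch (names
and dates invented; the loader would insert straight into the current
day's child table):

  -- parent table that queries go against
  CREATE TABLE events (
      event_time  timestamptz NOT NULL,
      device_id   bigint      NOT NULL,
      value       double precision
  );

  -- one child table per day, constrained so the planner can skip it
  -- when constraint_exclusion is enabled
  CREATE TABLE events_20130601 (
      CHECK (event_time >= '2013-06-01' AND event_time < '2013-06-02')
  ) INHERITS (events);

  -- retiring a day is then just:
  -- DROP TABLE events_20130601;

The idea being that dropping a whole child table is far cheaper than
deleting 1.5 billion rows out of one giant table.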
cheers,
Chris