New to PostgreSQL, performance considerations

Hi y'all,

Although I've worked with databases for more than 7 years now, I'm
pretty new to PostgreSQL.

I have an application using SQLite3 as an embedded SQL solution
because it's simple and it can handle the load that *most* of my
clients have.

Because of that '*most*' part, because of the client/server model, and
because of the license, I'm thinking about starting to use PostgreSQL.

My app uses only three tables: one has low read and really high write
rates, a second has high read and low write rates, and the third is
equally high on both.

I need a db that can handle something like 500 operations/sec
continuously, roughly 250 writes/sec and 250 reads/sec. My tables are
indexed.

Each table would have to handle about 5 million rows/day, so I'm
thinking about creating different tables (clusters?) for different days
to make queries return faster. Am I right, or is there no problem in
having 150 million rows (one month) in a table?

All my data is e-mail traffic: users' quarantine, inbound traffic,
outbound traffic, senders, recipients, subjects, attachments, etc...
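To give a better idea of the data, the three tables would be roughly
along these lines (very much simplified; the table and column names are
just illustrative, not my final schema):

  CREATE TABLE quarantine (
      msg_id      bigserial PRIMARY KEY,
      recipient   text NOT NULL,
      subject     text,
      received_at timestamptz NOT NULL DEFAULT now()
  );

  CREATE TABLE inbound_traffic (
      msg_id      bigserial PRIMARY KEY,
      sender      text NOT NULL,
      recipient   text NOT NULL,
      subject     text,
      attachments integer,
      received_at timestamptz NOT NULL DEFAULT now()
  );

  CREATE TABLE outbound_traffic (
      msg_id      bigserial PRIMARY KEY,
      sender      text NOT NULL,
      recipient   text NOT NULL,
      subject     text,
      sent_at     timestamptz NOT NULL DEFAULT now()
  );

Each of these would get roughly the 5 million rows/day I mentioned above.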

What do you people say, is it possible with PostgreSQL? What kind of
hardware would I need to handle that kind of traffic?

On a first test, on a badly tuned AMD Athlon XP 1800+ (ergh!), I could
do 1400 writes/sec locally after I disabled fsync. We have UPSs; in the
last year we only had one power failure.
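By "disabled fsync" I mean this setting in postgresql.conf:

  # don't force the WAL out to disk on every commit -- unsafe without
  # reliable power, which is why I mentioned the UPSs above
  fsync = off

The rest of the configuration on that test box was not really tuned at all.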

Thank you all for your tips.

Best regards,
Daniel Colchete

