Re: Hi Community

Hi Everyone, 

Thank you all for your replies! I really appreciate it and am pleased to be part of such a great community of enthusiastic people.

@steve: I replied inline to your questions.

On 3 February 2015 at 00:57, Steve Crawford <scrawford@xxxxxxxxxxxxxxxxxxxx> wrote:
On 02/02/2015 03:18 AM, Naresh Soni wrote:
Good Evening Community

This is my first question on the list. I wanted to ask: can Postgres handle multi-million-record tables? For example, there will be 1 million records per table per day, so 365 million per year.

If yes, then please elaborate.


Yes. Maybe. But you will need to elaborate. (A plain text file can store millions of records, too. The use pattern is what matters.)

For reference, I have a modest database of about 350 million records running on modest hardware: a several-year-old server with 4 cores, 24GB RAM, and a 4-drive RAID10 with battery backup. I get mostly acceptable performance, with query peaks well over 1,000 TPS at about 2/3 writes.

The people here can give you lots of good advice; purchasing the PostgreSQL Performance book is a good start. PostgreSQL is feature-rich, but you will need to design and choose appropriately. To get started, a few questions jump out:

  • You say 1-million/day/table. How many tables (i.e. what is the total record count)?
There is only 1 table in the database that will hold such a huge volume of data, i.e. 1 million records per day. No doubt there are other tables in the database, but their record counts are limited.
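As an aside, at that volume bulk loading with COPY rather than row-by-row INSERTs usually pays off. A minimal sketch, assuming a hypothetical invoices table and CSV feed (table, columns, and path are made up for illustration):

    -- hypothetical table; the real schema will differ
    CREATE TABLE invoices (
        invoice_id   bigserial     PRIMARY KEY,
        customer_id  integer       NOT NULL,
        invoice_date date          NOT NULL,
        amount       numeric(12,2) NOT NULL
    );

    -- COPY loads a day's rows in one pass, far faster than per-row INSERTs
    COPY invoices (customer_id, invoice_date, amount)
        FROM '/path/to/daily_feed.csv' WITH (FORMAT csv, HEADER true);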

  • How long do you want to keep the data (total anticipated database size)?

I plan to use a new database every year.
  • Can the data be easily partitioned? For example, can you set up child tables by month for query and archive efficiency?

I don't think so.
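For reference, should monthly child tables become an option later: in PostgreSQL 9.x this is done with table inheritance plus an insert trigger. A minimal one-month sketch against the hypothetical invoices table above:

    -- child table holding one month; the CHECK constraint lets the planner
    -- skip irrelevant months via constraint exclusion
    CREATE TABLE invoices_2015_01 (
        CHECK (invoice_date >= DATE '2015-01-01'
           AND invoice_date <  DATE '2015-02-01')
    ) INHERITS (invoices);

    -- route inserts on the parent to the matching child
    CREATE OR REPLACE FUNCTION invoices_insert_router() RETURNS trigger AS $$
    BEGIN
        IF NEW.invoice_date >= DATE '2015-01-01'
           AND NEW.invoice_date < DATE '2015-02-01' THEN
            INSERT INTO invoices_2015_01 VALUES (NEW.*);
        ELSE
            RAISE EXCEPTION 'no partition for date %', NEW.invoice_date;
        END IF;
        RETURN NULL;  -- the row has already been stored in the child
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER invoices_insert_trigger
        BEFORE INSERT ON invoices
        FOR EACH ROW EXECUTE PROCEDURE invoices_insert_router();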
  • What sort of query mix do you anticipate? What portion of inserts, updates, selects,...? Lots of data-warehouse type queries?

Yes, there will be complex queries, as this table will be used for reporting purposes. In short, this table will contain financial invoices.
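Reporting queries of that sort typically aggregate over a date range, so an index on the date column is the first thing to add. A hedged example against the hypothetical invoices table above:

    -- supports date-range scans for reports
    CREATE INDEX invoices_invoice_date_idx ON invoices (invoice_date);

    -- example: January revenue per customer, biggest first
    SELECT customer_id,
           sum(amount) AS revenue
    FROM invoices
    WHERE invoice_date >= DATE '2015-01-01'
      AND invoice_date <  DATE '2015-02-01'
    GROUP BY customer_id
    ORDER BY revenue DESC;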
  • Is the schema fairly simple or are there lots of complex foreign-key constraints?
This table has 1 primary key and 15 foreign key constraints.

  • What sort of performance do you need - fraction-of-a-second transactional response or overnight analytical runs?
A good response time is expected when data is being fetched.
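When a query is slower than expected, EXPLAIN ANALYZE is the standard first step: it shows the actual plan, row counts, and timing. For example, against the hypothetical invoices table:

    -- runs the query and reports where the time goes
    EXPLAIN ANALYZE
    SELECT sum(amount)
    FROM invoices
    WHERE invoice_date = DATE '2015-01-15';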

  • What hardware can you throw at it? At one end I've run PostgreSQL on a Raspberry Pi. At the other end, I'm about to migrate a database to a 20-core/128GB machine with Intel SSD storage that can purportedly match the TPS of 2,000 15k drives. (This is certainly not the upper limit of hardware for running PostgreSQL). Each is appropriate for its use-case.


Not planned yet.
 
In any case, don't let the number of options and features overwhelm you - you don't need to optimize prematurely. Describe your use-case here for some starting suggestions, get familiar with it, and adjust your design and settings as necessary.
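On the settings side, a few parameters are usually adjusted first. A hedged sketch using ALTER SYSTEM (available from PostgreSQL 9.4); the values are illustrative placeholders sized loosely to a 24GB machine, not recommendations:

    -- illustrative starting points only; size these to your own hardware
    ALTER SYSTEM SET shared_buffers = '6GB';        -- ~25% of RAM is a common rule of thumb (restart required)
    ALTER SYSTEM SET effective_cache_size = '18GB'; -- rough estimate of memory available for caching
    ALTER SYSTEM SET work_mem = '64MB';             -- per-sort/hash memory; reporting queries benefit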

Welcome to the list.

Thank you so much for your help !
 
Cheers,
Steve




--
Regards,
Naresh Soni




