Re: Thanks for insights into internal design

That's fine for a system like Access or dBASE, but you should never be making queries that large in
a production application.
Access, dBASE, or any other local file-based system will have no problem bringing back a million
records, because it does not have to move the records across the wire via TCP/IP.

You should always limit queries by a date range, or at least implement a paging system.
Fetching 250,000 to 1 million rows is also going to consume a huge amount of memory on the client side.
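The paging idea above can be sketched with keyset pagination, where each page picks up after the last
key seen rather than using a growing OFFSET. This is a minimal sketch using Python's stdlib sqlite3
driver for portability; the table name `orders` and its columns are hypothetical, but the same SQL
pattern applies to PostgreSQL:

```python
import sqlite3

# Hypothetical table for illustration; the same pattern works in PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, created TEXT)")
conn.executemany(
    "INSERT INTO orders (id, created) VALUES (?, ?)",
    [(i, f"2007-01-{i % 28 + 1:02d}") for i in range(1, 1001)],
)

def fetch_page(conn, last_id=0, page_size=100):
    """Keyset pagination: fetch the next page of rows after last_id.

    Unlike OFFSET-based paging, this stays fast as the table grows,
    because the WHERE clause lets the primary-key index skip straight
    to the start of the page instead of scanning past earlier rows.
    A date-range filter (e.g. "AND created >= ?") can be added the
    same way to bound the result set further.
    """
    cur = conn.execute(
        "SELECT id, created FROM orders WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, page_size),
    )
    return cur.fetchall()

page1 = fetch_page(conn)                         # rows with id 1..100
page2 = fetch_page(conn, last_id=page1[-1][0])   # rows with id 101..200
```

Either way, the client only ever holds one page in memory, which is the point being made here.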

It does not seem like you have really caught on to the concept of a client/server system.
It does not matter if there are a billion rows, because you should NEVER let an end user bring back
the full set anyway.  Think about it.

PostgreSQL is not a local file-based system like Access or dBASE; you can't use the same testing methods,
or you will be in for a world of hurt.


You give me valuable insight into the inner workings of such software. I am a firm believer in testing everything with very large files. One might spend months developing something, have it in production for a year, and not realize what will happen when the files (tables) grow to several million records (rows). And it takes so little effort to create large test files.

---------------------------(end of broadcast)--------------------------- TIP 7: don't forget to increase your free space map settings
