That's fine for a system like Access or DBASE, but you should never be making queries that large in
a production application.
Access, DBASE, or any other local file-based system will not have any problem bringing back 1 million
records, because it does not have to move the records across the wire via TCP/IP.
You should always limit queries by a date range, or at least implement a paging system.
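
For example, something along these lines (just a sketch; the orders table and columns are made up
for illustration):

    -- Page through one month of data, 100 rows at a time (keyset paging).
    SELECT order_id, customer_id, order_date, total
    FROM   orders
    WHERE  order_date >= DATE '2005-01-01'
      AND  order_date <  DATE '2005-02-01'
      AND  order_id   >  100000        -- last order_id seen on the previous page
    ORDER  BY order_id
    LIMIT  100;

Each request then pulls only a small slice across the wire instead of the whole table.
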
250,000 to 1 million rows is also going to suck up a huge amount of system memory on the client side.
It does not seem like you are really catching on to the concept of a client/server system.
It does not matter if there are a billion rows, because you should NEVER be letting an end user bring back
the full amount anyway. Think about it.
PostgreSQL is not a local file-based system like Access or DBASE; you can't use the same testing methods,
or you will be in for a world of hurt.
You give me valuable insight into the inner workings of such software.
I am a firm believer in testing everything with very large files. One
might spend months developing something, and have it in production for a
year, and not realize what will happen when their files (tables) grow to
several million records (rows). And it takes so little effort to create
large test files.
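
For instance, in PostgreSQL a few million throwaway rows can be generated in one statement (a sketch
only; the test table and columns are invented for illustration):

    -- Build a test table and fill it with 5 million synthetic rows.
    CREATE TABLE test_orders (
        order_id   integer PRIMARY KEY,
        order_date date,
        total      numeric(10,2)
    );

    INSERT INTO test_orders (order_id, order_date, total)
    SELECT g,
           DATE '2000-01-01' + (g % 2000),   -- spread dates over several years
           (g % 1000) / 10.0                 -- arbitrary amounts
    FROM   generate_series(1, 5000000) AS g;
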