On May 2, 2008, at 2:01 PM, finecur wrote:
Hi, I am running a database behind a webserver and there is a table which is huge. I need to pull data from this table and send it to the user over http. If I use

    select * from huge_table where userid = 100

it will return millions of records, which exhausts my server's memory. So I do this:

    select * from huge_table where userid = 100 limit 1000 offset 0

and send the results to the user, then

    select * from huge_table where userid = 100 limit 1000 offset 1000

and send the results to the user, then

    select * from huge_table where userid = 100 limit 1000 offset 2000

and send the results to the user, continuing until there are no records left. It runs great but it is kind of slow. I think it is because even though I need only 1000 records, the query searches the whole table every time.
Not quite - if you do a "limit 1000 offset 5000" it'll stop after retrieving
the first 6000 rows from the table. A bigger problem with doing it this way is that the results aren't particularly well defined unless there's an ORDER BY clause in the query.
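For example (assuming the table has a unique id column you can order on):

    SELECT * FROM huge_table
    WHERE userid = 100
    ORDER BY id
    LIMIT 1000 OFFSET 2000;

Without the ORDER BY, nothing guarantees that consecutive pages won't repeat or skip rows.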
Is there a better way to do this?
You want a cursor. See
http://www.postgresql.org/docs/8.3/static/sql-declare.html

Cheers,
Steve
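P.S. A minimal sketch of what that looks like (untested; the cursor name is arbitrary, and a plain cursor has to live inside a transaction block):

    BEGIN;
    DECLARE user_rows CURSOR FOR
        SELECT * FROM huge_table WHERE userid = 100;
    -- repeat the FETCH, sending each batch to the user,
    -- until it returns no rows
    FETCH 1000 FROM user_rows;
    CLOSE user_rows;
    COMMIT;

Each FETCH picks up where the previous one left off, so you never rescan the rows you've already sent the way OFFSET does.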