* Keary Suska <hierophant@xxxxxxxxxx> [18 05 02 20:32]:
> on 5/18/02 6:09 AM, andrew@xxxxxxxxxxxxxxx purportedly said:
>
>>> I used persistent connections, however, I got lots of <defunct>
>>> processes. If the CURSOR is closed when the script has finished
>>> (pg_connect), it makes no sense (for me) to use it, because I cannot
>>> move within the results.
>>
>> Right, so your problem is that you're using a CURSOR, which you are
>> leaving open on the off-chance your visitor will return and want to
>> see the 'next record' or something.
>>
>> I think you should really just bite the bullet and use the better
>> syntax on SELECT. You may have a little more programming work to do,
>> but it has little impact on performance as far as I can see.
>>
>> PostgreSQL will optimise queries differently when they have LIMIT
>> clauses too.
>
> I'm afraid Andrew is correct here -- with persistent connections, you
> have no control over when the connection is closed, or over which
> connection you are using at any one point. Thus you would still have
> to keep track of where the cursor is, and furthermore you would have
> to check for the existence of the cursor on every script invocation,
> which I don't believe there is a command for, so you would have to try
> to use the cursor and then capture any error. In the case of an error,
> you would need to re-define the cursor and move to the current point
> according to the stored cursor location info.
>
> Depending on your program logic, this could be more difficult than
> using LIMIT. Using LIMIT has the added benefit of avoiding the
> overhead of many unused cursors, which can build up quickly, as you
> also have little or no control over the cursors, since cursors are
> tied to the connection.
>
> The drawback to using LIMIT is that you are executing the query on
> every call, and Postgres cannot optimize on a LIMIT as was mentioned,
> because the entire query has to be collected before the LIMIT can be
> applied.
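The LIMIT approach described above amounts to re-running the query on each
page request, with the window computed from a page number the script
receives from the browser. A minimal sketch of the idea -- shown here with
Python's built-in sqlite3 instead of PHP's pg_* functions purely so it is
self-contained and runnable; the table name `articles` and the page size
are assumptions for illustration:

```python
import sqlite3

PAGE_SIZE = 10

def fetch_page(conn, page):
    """Return one page of rows. The query is re-executed on every
    request, as a stateless web script must do."""
    offset = (page - 1) * PAGE_SIZE
    cur = conn.execute(
        "SELECT id, title FROM articles ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    )
    return cur.fetchall()

# Demo with an in-memory database standing in for PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO articles (title) VALUES (?)",
                 [("article %d" % i,) for i in range(1, 26)])

page2 = fetch_page(conn, 2)   # rows 11..20 of 25
```

Note the ORDER BY: without a stable sort order, consecutive LIMIT/OFFSET
queries are not guaranteed to return consistent pages.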
> However, IIRC, Postgres does query result caching, so subsequent calls
> on the same query will tend to be faster, provided there is enough
> memory allocated to support it.

That's what I assumed, because the HTTP protocol does not allow
persistent connections to be established: the web server sends the
requested document and closes the connection. As far as the HTTP
protocol is concerned, each new request is a new connection.

If you are using non-persistent database connections, the CURSOR is
closed when the script has finished. If you are using persistent
database connections, you can establish an individual CURSOR for each
user -- using, for example, the user_id as the name of the CURSOR (or a
random number). Then, however, the user may shut down the browser and we
still have the CURSOR.

Thanks for your help.

--
--//--\\--
Eckhard Hoeffner
e-hoeffner@xxxxxxxxxxx
Tal 44
D-80331 München