On Sat, 19 Jan 2002, Mitch Vincent wrote:

> http://www.php.net/manual/en/function.set-time-limit.php will allow you to
> set the timeout but you might post the queries and EXPLAIN output to
> pg-general to see if there is something that can be done to speed them up...
> Some well placed indexes always help!
>
> > You can always use persistent connections. It's not a good idea to use
> > too many, or you will be denied access to Postgres any other way, but it's
> > a quick fix for the short run.
>
> The maximum connections is a configuration parameter in postgresql.conf --
> persistent connections are pretty cool but have been broken until recent
> versions of PHP so I'm still hesitant to use them..

It's good to see someone else sees problems with persistent connections.
You could set your maximum connections to infinite, but would that really
do any good? It seems the server might easily run out of memory if there
isn't a cap on the number of connections, as well as on connection time.

> > Long run and responsibility includes using an array, or an object to
> > retrieve data.
>
> Huh?

I can see why this doesn't make much sense; we have nothing to work with
from the original post regarding just what the query is, so I'll try to
give an example. Say we're querying all the names in a phone book DB:

select first_name, last_name from white_pages;
-- this returns, let's say, 150,000 entries (a small city phone book)

Instead, you could select by starting letter, and put each batch into an
array using pg_fetch_array, or into an object using pg_fetch_object. Make
multiple queries in the same script, and use that array (or object) to
output to the browser:

select first_name, last_name from white_pages where last_name like 'A%';
select first_name, last_name from white_pages where last_name like 'B%';
select first_name, last_name from white_pages where last_name like 'C%';
select first_name, last_name from white_pages where last_name like 'D%';
...etc.

If a batch is still too big, try 'Aa%', 'Ab%', and
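To tie the SQL back to the PHP side, here is a minimal sketch of the
letter-by-letter approach. The white_pages table and its columns are just
the illustration from this mail, and the pg_connect parameters are
hypothetical; this is not tested against a real backend.

```php
<?php
// Build one small query per starting letter, as described above.
// Table and column names are assumptions taken from the phone-book example.
function prefix_queries($table, $column) {
    $queries = array();
    foreach (range('A', 'Z') as $letter) {
        $queries[] = "SELECT first_name, last_name FROM $table "
                   . "WHERE $column LIKE '$letter%';";
    }
    return $queries;
}

// Hypothetical usage against a live backend -- each query stays small,
// so no single call runs long enough to trip PHP's time limit:
// $conn = pg_connect("dbname=phonebook");
// foreach (prefix_queries('white_pages', 'last_name') as $sql) {
//     $result = pg_query($conn, $sql);
//     while ($row = pg_fetch_array($result)) {
//         echo $row['last_name'] . ", " . $row['first_name'] . "<br>\n";
//     }
// }
?>
```

Each pg_query call here returns quickly, so the script makes steady
progress instead of blocking on one giant result set.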
so on. Tedious, but efficient. I know this is only the SQL; there is PHP
involved too, but we can save that for the PHP list, or for any further
questions. Hope this clarifies at least a little bit.

> > You can always split queries up into many small queries.
>
> What makes you say that?

If you have your RDBMS set up right, you have a unique number (a primary
key) relating to each row. So, using the SQL syntax above, you could
select one row at a time if you needed to, using a for loop and sending
single queries to the backend. You shouldn't be flooding a web browser
with over 30 seconds of data anyway, as you run the risk of a crash.

> -Mitch
>
> ---------------------------(end of broadcast)---------------------------
> TIP 6: Have you searched our list archives?
>
> http://archives.postgresql.org

Chadwick Rolfs - cmr@xxxxxxx
Cleveland State University - Student Music Major
The Holden Arboretum Volunteer
Computer Programmer - Student Employee
--*I finally found powdered water; I just can't figure out what to add to it*--
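P.S. The one-row-at-a-time idea above can be sketched like this, assuming
a hypothetical serial primary key column "id" on the same example table
(again an untested sketch, not a recommendation for 150,000 rows):

```php
<?php
// With a unique primary key, each loop iteration can send one tiny
// query to the backend instead of one huge query.
function row_query($table, $id) {
    return "SELECT first_name, last_name FROM $table WHERE id = $id;";
}

// Hypothetical usage against a live connection:
// $conn = pg_connect("dbname=phonebook");
// for ($id = 1; $id <= 150000; $id++) {
//     $result = pg_query($conn, row_query('white_pages', $id));
//     if ($row = pg_fetch_array($result)) {
//         echo $row['last_name'] . ", " . $row['first_name'] . "<br>\n";
//     }
// }
?>
```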