What I usually do is pull a limited set of records (say 10 or 50), do the operations on them, update a column in that table to mark them completed, and then use JavaScript to reload the page and pull the next set where that flag field is null. No memory issues, no need for large timeouts, and it's self-recovering. A rough sketch of the pattern is below the quoted message.

Bastien

On Tuesday, March 16, 2010, Richard S. Crawford <richard@xxxxxxxxxxxxx> wrote:
> I have a script that connects to an external database to process about 4,000
> records. Each record needs to be operated on pretty heavily, and the script
> overall takes about half an hour to execute. We've hit a wall where the
> script's memory usage exceeds the amount allocated to PHP. I've increased
> the allotted memory to 32MB, but that's not ideal for our purposes.
>
> So my thought is: would it be more efficient, memory-wise, to read the
> database entries into an array and process the array records, rather than
> maintain the database connection for the entire run of the script? This is
> not an issue I've come across before, so any thoughts would be much
> appreciated.
>
> Thanks in advance.
>
> --
> Richard S. Crawford (richard@xxxxxxxxxxxxx)
> http://www.underpope.com
> Publisher and Editor in Chief, Daikaijuzine (http://www.daikaijuzine.com)

--
Bastien
Cat, the other other white meat
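
Here is a minimal sketch of that batch-and-flag approach, assuming a MySQL table with a nullable flag column. The table name (records), the flag column (processed), and the process_record() helper are placeholders for illustration, not anything from Richard's actual schema:

<?php
// Rough sketch of the batch-and-flag pattern described above.
// "records", "processed", and process_record() are invented names;
// adjust them to your own schema and per-record logic.

$db = new mysqli('localhost', 'user', 'pass', 'mydb');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}

$batchSize = 50;

// Pull only the next small batch of unprocessed rows.
$result = $db->query(
    "SELECT id, payload FROM records
     WHERE processed IS NULL
     LIMIT $batchSize"
);

$allDone = ($result->num_rows === 0);

while ($row = $result->fetch_assoc()) {
    process_record($row);  // your heavy per-record work goes here

    // Flag the row so the next page load skips it.
    $db->query("UPDATE records SET processed = 1 WHERE id = " . (int)$row['id']);
}
$result->free();

if ($allDone) {
    echo "All records processed.";
} else {
    // Reload the page via JavaScript to fetch the next batch.
    echo '<script>window.location.reload();</script>';
}
?>

Since every page load is a fresh PHP request, the memory used by the previous batch is released automatically. That is also what makes it self-recovering: if a batch dies partway through, the unflagged rows simply get picked up again on the next reload.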