Well, just an update: looks like it was me complicating things. I returned to the original code and just updated references like linkage and table names, and it worked. What I had missed was that the prior dev had imploded some of those arrays into SQL strings and then dumped the arrays. I looked right past that part and had just created new arrays. Smart guy. Took an array and made it a string. Less overhead! ;)

Thanks for the input, all.

Best,

Karl DeSaulniers
Design Drumm
http://designdrumm.com

On Oct 11, 2015, at 9:48 PM, Karl DeSaulniers <karl@xxxxxxxxxxxxxxx> wrote:

> On Oct 11, 2015, at 9:41 PM, Gener Badenas <gener.ong.badenas@xxxxxxxxx> wrote:
>
>>
>>
>> On Mon, Oct 12, 2015 at 10:27 AM, Karl DeSaulniers <karl@xxxxxxxxxxxxxxx> wrote:
>> On Oct 11, 2015, at 9:21 PM, Aziz Saleh <azizsaleh@xxxxxxxxx> wrote:
>>
>> >
>> >
>> > On Sun, Oct 11, 2015 at 10:14 PM, Karl DeSaulniers <karl@xxxxxxxxxxxxxxx> wrote:
>> > Getting this error:
>> > Fatal error: Allowed memory size of 103809024 bytes exhausted (tried to allocate 523800 bytes) in [...]/html/wp-includes/cache.php on line 113
>>
>> Could it be that other WordPress plugins are consuming the memory? Or just your custom code?
>>
>> >
>> > It is being generated by a script that reads an Excel file and then updates a database with the new info.
>> > I have gone in and unset all the arrays that were set along the way to alleviate this, to no avail. Got down to 9 bytes once, though.
>> > However, my question isn't about allocating memory. I know how to do that server side, but I want to avoid that.
>> >
>> > This is a simple order database with fields filled with order info. Nothing complicated.
>> > What I am wondering is the best way to avoid this memory issue, knowing there will be a lot of entries to go through.
>> > Read each field from the spreadsheet and insert individually? Read all fields from the spreadsheet and then insert individually?
>> > Any known ways of reading all and inserting all at the same time? Suggestions?
>> >
>> > TIA,
>> > Best,
>> >
>> > Karl DeSaulniers
>> > Design Drumm
>> > http://designdrumm.com
>> >
>> >
>> >
>> >
>> > To avoid memory issues, I would do each row individually, reading the file line by line using something like fgetcsv/fgets.
>> >
>> > If you attempt to read the entire file and then insert, you are hostage to the size of the file. Doing it line by line avoids this issue even if you have a 10GB file.
>>
>>
>> Thanks Aziz,
>> That is what I was thinking, since at the moment it is as you describe: reading everything from the sheet first and then inserting.
>> My task now is how to iterate each row individually; however, I have a dependency: PHPExcel.
>> Not sure how to use it to read only one line at a time.
>> I appreciate the corroboration.
>>
>> Best,
>>
>> Karl DeSaulniers
>> Design Drumm
>> http://designdrumm.com
>
> The prior developer had all the information being pushed into multiple arrays.
> I tried reducing these arrays, as I felt it was redundant. I wasn't thinking of the memory size at that point,
> but you would think reducing the number of arrays storing the same info would reduce memory usage.
> And it did, but it still seems a poor route when there is a better, more efficient way to do this.
>
> Best,
>
> Karl DeSaulniers
> Design Drumm
> http://designdrumm.com
>
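
For reference, a minimal sketch of the approach Karl describes at the top of the thread: implode the collected row arrays into one SQL string, unset the arrays, and run a single multi-row INSERT. The table and column names (orders, order_id, sku, qty) are invented for illustration, since the actual schema isn't shown in the thread; it assumes the script runs inside WordPress with $wpdb available.

<?php
// Sketch: build one multi-row INSERT string from the collected rows,
// then unset the arrays so only the string stays in memory.
// $rows is assumed to be an array of row arrays read from the spreadsheet.
global $wpdb;

$values = array();
foreach ($rows as $row) {
    // esc_sql() is WordPress's escaping helper for raw SQL fragments.
    $values[] = "('" . implode("','", array_map('esc_sql', $row)) . "')";
}

// Hypothetical table/columns -- substitute the real order table here.
$sql = "INSERT INTO {$wpdb->prefix}orders (order_id, sku, qty) VALUES "
     . implode(",", $values);

unset($rows, $values);   // drop the arrays; the SQL string is all that remains

$wpdb->query($sql);      // one round trip instead of one query per row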
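Aziz's line-by-line suggestion would look roughly like this if the order data were exported as CSV instead of a native Excel file. The DSN, credentials, file name, and table/columns are placeholders, not details from the thread.

<?php
// Sketch: stream the file one row at a time and insert as you go,
// so memory use stays flat regardless of file size.
$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO orders (order_id, sku, qty) VALUES (?, ?, ?)');

$handle = fopen('orders.csv', 'r');
if ($handle !== false) {
    fgetcsv($handle);                                // skip the header row, if any
    while (($row = fgetcsv($handle)) !== false) {
        $stmt->execute(array($row[0], $row[1], $row[2]));
    }
    fclose($handle);
}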
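On the PHPExcel question: PHPExcel can't stream a worksheet strictly one row at a time, but it does support chunked reading through a read filter, which keeps only a slice of rows in memory per load. This is a sketch of that documented pattern; the file name, chunk size, and the 10000-row upper bound are arbitrary placeholders.

<?php
// Sketch: load the workbook in chunks of rows using a PHPExcel read filter.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize) {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '') {
        // Always read row 1 (headings), plus the rows in the current chunk.
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFile   = 'orders.xlsx';   // placeholder file name
$chunkSize   = 500;
$chunkFilter = new ChunkReadFilter();

$reader = PHPExcel_IOFactory::createReader('Excel2007');
$reader->setReadFilter($chunkFilter);

for ($startRow = 2; $startRow <= 10000; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load($inputFile);

    // ... read the loaded rows and insert them into the database here ...

    $objPHPExcel->disconnectWorksheets();   // break internal cell references
    unset($objPHPExcel);                    // free this chunk before the next load
}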