The spreadsheet is 2M, but 12M wasn't enough. This is the code that reads the data from the csv:

$row = 0; // row counter for $PRODUCTS
$fp = fopen('/var/www/html/afan.com/admin/tmp/'.$SpreadSheetFile, "r");
while ($data = fgetcsv($fp, 50000, ",")) {
    $num = count($data);
    for ($c = 0; $c < $num; $c++) {
        $PRODUCTS[$row][$c] = $data[$c];
    }
    ++$row;
}
fclose($fp);

Then, for each row of the $PRODUCTS array, the script checks whether that product no. already exists; if not, it inserts the product into the DB, calls select_id(), and inserts into two other tables (categories and prices). (A rough sketch of doing that work row by row, without the $PRODUCTS array, is below the quoted thread.)

> afan@xxxxxxxx wrote:
>> Hi,
>> I'm working on a script that has to read a csv file (51 columns and a
>> little bit over 3000 rows) with products and store the info in the DB.
>> The script worked fine while I tested it on csv files of up to 200 records.
>> But when I tried to upload the REAL data I got this:
>> PHP: Fatal error: Allowed memory size of 8388608 bytes exhausted ...
>>
>> On google I found as a solution to put
>> ini_set("memory_limit","12M");
>> at the top of the script, except then the same error appeared with an
>> allowed memory size of 12M bytes instead of 8M.
>> I tried with 16M and it worked :)
>>
>> My question is: how "far" can I go with increasing it? Where is "the limit"?
>> Why isn't 24M the default in php.ini?
>>
>> Thanks!
>>
>> -afan
>>
> I think 16M should be enough in most cases.
>
> I would like to see the script. Try to avoid reading large files in all at
> once with file() or file_get_contents().
> Use fopen and fgets. I work with txt files over 40M and hardly use 1M
> of memory.
>
> You could also try to run the program with the unix nice command. Just
> don't think that increasing the limit is a solution. What if you get a
> 300M csv file? Well, just my 2 cents.
>
> Thijs
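
For reference, here is a minimal sketch of doing the DB work row by row while the file is read, so only one csv row is ever held in memory at a time. product_exists(), insert_product(), select_id(), insert_category() and insert_price() are just placeholders for whatever the real queries look like, and the product no. is assumed to sit in column 0:

$fp = fopen('/var/www/html/afan.com/admin/tmp/'.$SpreadSheetFile, "r");
if ($fp === false) {
    die("Could not open csv file");
}
while (($data = fgetcsv($fp, 50000, ",")) !== false) {
    // Assumed layout: product no. in the first column.
    $product_no = $data[0];

    // Placeholder helpers -- replace with the real exists-check and inserts.
    if (!product_exists($product_no)) {
        insert_product($data);        // insert into the products table
        $id = select_id();            // id of the row just inserted
        insert_category($id, $data);  // insert into categories
        insert_price($id, $data);     // insert into prices
    }
}
fclose($fp);

Handled this way, memory usage stays roughly constant no matter how many rows the spreadsheet has, so memory_limit does not need to grow with the file.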