Tiago Hori wrote:
> So, I could use Jim suggestion, but maybe not add the whole 9000
> entries of time, correct? Would it be a good solution to create
> separate arrays with every 500 rows or just create one big array like
> Jim suggested and then break the insertion into iterations of 100
> rows?

I suggest splitting the whole process into two steps. The first step
reads the file and builds an array of records. The second step commits
these records to the database.

If you do it this way, you can easily change the actual insertion
method according to whatever is fastest (which may change with new
versions of MySQL and PHP); even LOAD DATA INFILE instead of INSERT
could easily be tested, by storing the array of records in a temporary
file between the two steps.

-- 
Christoph M. Becker

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
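For illustration, here is a minimal sketch of the two-step approach. The file name, table name, column names, and batch size are placeholders I made up for the example, not details from the thread; error handling is omitted for brevity.

```php
<?php
// Step 1: read the whole file into an array of records.
// Assumes a simple CSV file; adjust parsing to the real input format.
function readRecords(string $path): array
{
    $records = [];
    $fh = fopen($path, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        $records[] = $row;
    }
    fclose($fh);
    return $records;
}

// Step 2: commit the records to the database, inserting several
// rows per INSERT statement instead of one statement per row.
// Table/column names here (my_table, col_a, col_b) are hypothetical.
function insertRecords(PDO $pdo, array $records, int $batchSize = 500): void
{
    foreach (array_chunk($records, $batchSize) as $chunk) {
        // One "(?, ?)" group per row in this chunk.
        $placeholders = implode(', ', array_fill(0, count($chunk), '(?, ?)'));
        $stmt = $pdo->prepare(
            "INSERT INTO my_table (col_a, col_b) VALUES $placeholders"
        );
        // Flatten the chunk into a single flat parameter list.
        $stmt->execute(array_merge(...$chunk));
    }
}
```

Because the two steps only share the array of records, step 2 can later be swapped for a version that writes the array to a temporary file and runs LOAD DATA INFILE, without touching the parsing code.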