Jon Westcot wrote:
Hi David, et al.:
Thanks for the comment. I removed the trailing semi-colon in the two
areas where it was being sent to mysql_query() and tried the code again.
I'm still getting the same basic problem -- it silently aborts somewhere
between 22,000 and 26,000 records processed, out of just under 30,000.
When I don't build the $insert_query string, I am able to read through the
CSV file completely.
What indexes are on this table?
Each insert has to update every index as well as the data, so maybe
that's where all the time is being spent in the database (I doubt that's
the problem, but try dropping all of the indexes on the table).
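If the table is MyISAM, you could also suspend index maintenance for the
duration of the load instead of dropping the indexes outright. Something
along these lines (table name taken from your query; DISABLE KEYS only
affects non-unique indexes and is a no-op on InnoDB):

// Suspend non-unique index updates during the bulk load (MyISAM only)
mysql_query("ALTER TABLE evall DISABLE KEYS");

// ... run all of the INSERT statements here ...

// Rebuild the indexes in a single pass afterwards
mysql_query("ALTER TABLE evall ENABLE KEYS");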
Are you commenting out this whole section?
$insert_query = "INSERT INTO evall VALUES(";
for ($c=0; $c < $num; $c++) {
if($c > 0) {
$insert_query .= ",";
}
$insert_query .= '"' . $data[$c] . '"';
}
$insert_query .= ");";
Try

$insert_query = "INSERT INTO evall VALUES ('" . implode("','", $data) . "')";

so you're not doing a for loop when you don't need to. (Note the glue
string for implode() is ',' -- quote, comma, quote -- so each value ends
up individually quoted.)
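If the CSV fields can themselves contain quotes, you'd want to escape
each value before imploding. A rough sketch, assuming an open ext/mysql
connection:

// Escape every field so embedded quotes can't break the statement
$escaped = array_map('mysql_real_escape_string', $data);
$insert_query = "INSERT INTO evall VALUES ('" . implode("','", $escaped) . "')";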
Also as someone else suggested if this is a csv file you can use LOAD
DATA INFILE to directly import it instead of having to create a bunch of
insert queries.
See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Only do this if you are 100% sure of the data (i.e. it is sanitized in
some other step).
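For reference, the call could look something like this -- the file path
and the field/line terminators here are guesses about your file's format:

// Let MySQL parse the CSV directly instead of building ~30,000 INSERTs
mysql_query("LOAD DATA INFILE '/tmp/evall.csv'
             INTO TABLE evall
             FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
             LINES TERMINATED BY '\\n'");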
As odd as this sounds, should I put in some type of delay? Could the
system be thinking it's getting flooded by all of the inserts?
Doubt it.
--
Postgresql & php tutorials
http://www.designmagick.com/