Tiago Hori wrote:

> I'm fairly new at this, so please bear with me. :)
>
> I am building this web app for a project I am working on where I need
> to store and process large amounts of data.
>
> The data comes in comma-delimited style. There are a bunch of headers
> that I don't need and then the data. These files contain 96 times 96
> entries

Do you mean 96 records (rows) with 96 columns each, or do you mean 9216
records? What's the size of the file?

> and I need to parse each one of those. Right now I have something
> working that takes about 5 minutes to parse the whole file.

I doubt that *parsing* the file takes about 5 minutes.

> Any tips on how to make this run more efficiently would be greatly
> appreciated.

Measure first where your script spends time before you optimize in the
"wrong" place. A profiler such as Xdebug's or xhprof should come in
handy.

If you don't have a profiler available, I suggest you replace
queryMysql($query) with true and check how long it takes to process the
file. If that is much faster than before (and I presume so), you'll
have to optimize the insertion of the values into the database.
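For instance, a minimal sketch of that timing check (the file name, the
fgetcsv() loop, and the commented-out queryMysql() call are
illustrative assumptions about your script, not your actual code):

    <?php
    $start = microtime(true);

    $handle = fopen('data.csv', 'r');
    while (($row = fgetcsv($handle)) !== false) {
        // Temporarily replaced with true to time parsing alone:
        // queryMysql("INSERT INTO ...");
        true;
    }
    fclose($handle);

    printf("Processed in %.2f seconds\n", microtime(true) - $start);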
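If the inserts turn out to be the bottleneck, reusing one prepared
statement and wrapping all inserts in a single transaction usually
helps a lot; multi-row INSERTs or LOAD DATA INFILE can help even more.
A sketch under the same assumptions, using a PDO connection $pdo and a
hypothetical table measurements(well, value); adjust the names to your
schema:

    <?php
    $pdo->beginTransaction();
    $stmt = $pdo->prepare(
        'INSERT INTO measurements (well, value) VALUES (?, ?)'
    );

    $handle = fopen('data.csv', 'r');
    while (($row = fgetcsv($handle)) !== false) {
        // Reuse the prepared statement for every parsed row.
        $stmt->execute([$row[0], $row[1]]);
    }
    fclose($handle);

    // One commit at the end instead of one implicit commit per row.
    $pdo->commit();

-- 
Christoph M. Becker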