Wolf wrote:
Danny Brow wrote:
I have about 10 CSV files I need to open to access data. It takes a lot
of time to search each file for the values I need. Would it be best to
just dump all the CSV files into an SQL db and then just grab what I need
from there? I'm starting to think it would make a lot of sense. What do
you guys think?
Thanks,
Dan
Dan,
I can tell you that the size of your files is going to dictate the
route you want to go. I have a CSV with 568,000+ lines, with 19 fields
per line. The files are around 180M apiece, and it takes my server
about 2 seconds to run a system grep against them. I can run a
recursive call 7 times against a MySQL database with the same
information, and it takes about 4 seconds.
If you have system call ability, a grep wouldn't be bad. Otherwise I'd
suggest loading the CSV files into MySQL tables and checking them for
the information, then dropping the tables when you get the next files.
You can even back up the databases with a cron job overnight.
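For the grep route, a minimal sketch might look like the following. The
file path and the search value are placeholders, not from the original
post; substitute your own.

```shell
# Placeholder path and value -- adjust to your data.
# Search every CSV in one pass (prints all matching lines):
grep -h '12345' /path/to/data/*.csv

# Or match only when the value sits in a specific comma-delimited
# column (here the 3rd), which plain grep can't do cleanly:
awk -F',' '$3 == "12345"' /path/to/data/*.csv
```

The awk variant avoids false hits when the value can also appear in
other columns.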
HTH,
Wolf
If you do go the MySQL route, MySQL can import CSV files natively, and
it'll be a lot faster than doing it through PHP. Just look up the
syntax for the LOAD DATA INFILE command, or look here:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Here's an example:
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
I've had to do imports of a million or more records from CSV files, and
PHP is a lot slower than MySQL at importing them :)
--
Ray Hauge
www.primateapplications.com
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php