Re: Re: Efficiently parsing a File

Hi All,

Just wanted to say that using LOAD DATA INFILE worked perfectly; the upload is
now practically instantaneous! Thank you. I had to fight AppArmor for most of
the morning, but I finally found a way to let MySQL read from /var/www.

That made me wonder, and bear in mind that I am very new at this: on a
production server, does MySQL have access to the website base directory by
default, or do I have to use something like LOAD DATA LOCAL INFILE?
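For context, a minimal sketch of the difference (the `samples` table name and the PDO handle are assumptions for illustration, not from this thread): with server-side LOAD DATA INFILE, mysqld opens the file itself, so the FILE privilege, secure_file_priv, and AppArmor all apply on the server; with LOAD DATA LOCAL INFILE, the connecting client reads the file and streams it over the connection, which sidesteps the server's filesystem restrictions, though local_infile must be enabled on both client and server.

```php
<?php
// Sketch only: the table name `samples` and the delimiters are
// assumptions for illustration, not from this thread.
function buildLoadSql(string $path, bool $local): string
{
    // Server-side: mysqld opens $path itself (FILE privilege,
    // secure_file_priv and AppArmor apply to the *server*).
    // Client-side (LOCAL): the client reads $path and streams it.
    $keyword = $local ? 'LOAD DATA LOCAL INFILE' : 'LOAD DATA INFILE';
    return $keyword . " '" . addslashes($path) . "'"
         . " INTO TABLE samples"
         . " FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\n'";
}

// Usage (hypothetical PDO handle):
// $pdo->exec(buildLoadSql('/var/www/data/upload.txt', true));
```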

Thanks!

T.





On Tue, Mar 18, 2014 at 10:07 PM, Tiago Hori <tiago.hori@xxxxxxxxx> wrote:

> Thanks everyone!
>
> I will try the temp file with LOAD DATA INFILE!
>
> T.
>
> Sent from my iPhone
>
> > On Mar 18, 2014, at 9:52 PM, Curtis Maurand <curtis@xxxxxxxxxxx> wrote:
> >
> >> On 3/18/2014 7:19 PM, Tiago Hori wrote:
> >> Jim and Christoph,
> >>
> >> Thanks!
> >>
> >>>> I am looking at all your suggestions. The file has 9216 entries with
> >>>> 10 columns, but I really only need 2 of them.
> >>>>
> >>>> The crucial one is the one that I have to split with explode() using
> >>>> the ": " separator.
> >>>>
> >>>> The files are only about 1MB; that's why I reached out, since I
> >>>> figured it shouldn't take that long.
> >>> I still presume the performance problem is due to the many separate
> >>> insert statements (reading such files with fgets() isn't the problem).
> >>> There is an article regarding the speed of insertions in MySQL
> >>> databases, with several optimization suggestions:
> >>>
> >>> <https://dev.mysql.com/doc/refman/5.7/en/insert-speed.html>
> >>>
> >>> HTH
> >> So, I could use Jim's suggestion, but maybe not add the whole 9000
> entries at a time, correct? Would it be better to create separate arrays
> of 500 rows each, or to create one big array like Jim suggested and then
> break the insertion into iterations of 100 rows?
> >>
> >> Thanks!
> >>
> >> T,
> > Write out a temporary flat file formatted the way you need it to look
> in the database, delimited however you need it. Then use "LOAD DATA
> INFILE" to populate the table. LOAD DATA INFILE is very fast and very
> efficient; it'll load a 1MB file in a fraction of a second.
> >
> > delete the flat file when you're done.
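A minimal sketch of that temp-file flow (the `samples` table, the column positions, and the ": " split are assumptions pieced together from the earlier posts, not confirmed code): write the two needed columns to a tab-delimited temp file, load it, then delete it.

```php
<?php
// Sketch of the temp-file approach described above. The input row
// layout (composite field at index 2, value at index 5) and the
// target table are assumptions for illustration.
function writeFlatFile(array $rows): string
{
    $tmp = tempnam(sys_get_temp_dir(), 'load_');
    $fh  = fopen($tmp, 'w');
    foreach ($rows as $row) {
        // Keep only the two needed columns; the composite field is
        // split on ": " as in the earlier posts.
        [$well] = explode(': ', $row[2], 2);
        fwrite($fh, $well . "\t" . $row[5] . "\n");
    }
    fclose($fh);
    return $tmp;
}

// Usage (hypothetical PDO handle):
// $file = writeFlatFile($parsedRows);
// $pdo->exec("LOAD DATA LOCAL INFILE " . $pdo->quote($file)
//          . " INTO TABLE samples FIELDS TERMINATED BY '\\t'");
// unlink($file);  // delete the flat file when you're done
```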
> >
> > --Curtis
> >
> > --
> > PHP General Mailing List (http://www.php.net/)
> > To unsubscribe, visit: http://www.php.net/unsub.php
> >
>



-- 
"Education is not to be used to promote obscurantism." - Theodosius
Dobzhansky.

"Thanks to life, which has given me so much
It gave me sound and the alphabet
And with it the words that I think and declare
Mother, friend, brother
And light shining on the path of the soul of the one I love

Thanks to life, which has given me so much
It gave me the stride of my tired feet
With them I have walked through cities and puddles
Beaches and deserts, mountains and plains
And your house, your street, and your courtyard"

Violeta Parra - Gracias a la Vida

Tiago S. F. Hori. PhD.
Ocean Science Center-Memorial University of Newfoundland
