Re: CSV speed

I should say that running 7 queries against a 200MB table with ~600,000
rows in 4 seconds looks to me as if there were no indexes and the database
design were very poor (I mean that the CSV file was dumped into the
database incorrectly). Or your server is a Pentium 100 :)
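
For illustration, here is a minimal sketch of what a proper import might
look like from PHP, using LOAD DATA INFILE plus an index on the column you
search by (the table, column, file names, and credentials are all made up):

    <?php
    // Connect; host, credentials, and database name are placeholders.
    $db = new mysqli('localhost', 'user', 'pass', 'testdb');

    // LOAD DATA is much faster than inserting row by row from PHP.
    $db->query("
        LOAD DATA LOCAL INFILE '/tmp/data.csv'
        INTO TABLE records
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
    ");

    // Without an index MySQL scans all ~600,000 rows on every query.
    // With one, a lookup should take milliseconds; run EXPLAIN on your
    // SELECT to confirm the index is actually used.
    $db->query('CREATE INDEX idx_lookup ON records (lookup_col)');
    $db->close();
    ?>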

P.S. I run a 1.5GB database myself, with some tables over 300MB, and it
has averaged 1,070 queries/sec over the last month.

2008/3/11, Wolf <LoneWolf@xxxxxxxxx>:
>
> Danny Brow wrote:
> > I have about 10 CSV files I need to open to access data. It takes a lot
> > of time to search each file for the values I need. Would it be best to
> > just dump all the CSV files into an SQL db and then just grab what I need
> > from there? I'm starting to think it would make a lot of sense. What do
> > you guys think?
> >
> > Thanks,
> > Dan
>
>
>
> Dan,
>
> I can tell you that the size of your files is going to dictate the
> route you want to go.  I have a CSV with 568,000+ lines and 19 fields
> on each line.  The files are around 180M apiece, and it takes my server
> about 2 seconds to run a system grep against them.  I can run a
> recursive call 7 times against a MySQL database with the same
> information, and that takes about 4 seconds.
>
> IF you have system call ability, a grep wouldn't be bad; otherwise I'd
> suggest loading the CSV files into MySQL tables and checking them for
> the information, then dropping the tables when you get the next files.
> You can even back up the databases overnight with a cron job.
>
> HTH,
>
> Wolf
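
For the archives, a rough sketch of the two approaches Wolf describes
above, assuming shell access and the hypothetical "records" table from my
sketch earlier in this message (all names are illustrative):

    <?php
    $needle = 'some value';

    // Option 1: shell out to grep (requires system call ability).
    // escapeshellarg() keeps the search term from breaking the command.
    $lines = array();
    exec('grep -h ' . escapeshellarg($needle) . ' /path/to/*.csv', $lines);

    // Option 2: query the MySQL table the CSV files were loaded into.
    $db  = new mysqli('localhost', 'user', 'pass', 'testdb');
    $sql = sprintf("SELECT * FROM records WHERE lookup_col = '%s'",
                   $db->real_escape_string($needle));
    $result = $db->query($sql);
    while ($row = $result->fetch_assoc()) {
        print_r($row);
    }
    $db->close();
    ?>

Grep is fine for one-off searches; once you hit the same data repeatedly,
the indexed table should pull ahead.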
