Re: Parsing a large file

> I have large log files from a web server (about a gig in size) and need
> to parse each line looking for a string, and when encountered push that
> line to a new file.  I was thinking I could have PHP read in the whole
> file, but thinking it could be a major pain since I have about 20 log
> files to read through.
> 
> Anyone have some suggestions?
> 
> Thanks,
> Robert

I'm actually in the process of doing the exact same thing!  If you search the
list you'll see some of my emails.  But to help you out, here's what I've got
so far. :)

Since you are dealing with such huge files, you'll want to read them a little
at a time so you don't use too much system memory all at once.  fgets() reads
a file one line at a time, so you can read a few lines, process them, and
then move on. :)

Hope this helps get you started!

// open log file for reading
if (!($fhandle = fopen($path . $log_file_name, "r"))) {
    echo "couldn't open $log_file_name for reading!";
    die;
}

$i = 0;
$buf = array();
while (!feof($fhandle)) {
    $line = fgets($fhandle);
    if ($line === false) {
        break;   // nothing more to read
    }
    $buf[] = $line;
    if (++$i % 10 == 0) {
        // process $buf here: do all the regex and whatnot
        // and collect the lines for the new text file to be
        // loaded into the database (haven't written this yet)

        // write to a file in the directory this runs in.
        // this file will be used to load data into a mysql
        // database to run queries on.

        // empty $buf out to free the memory
        $buf = array();
    }
}

// note: if the line count isn't a multiple of 10, whatever is
// still sitting in $buf needs the same processing here

fclose($fhandle);
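For the specific "find a string, copy the line to a new file" part of Robert's question, you don't even need the buffer: you can match and write each line as you go, so memory use stays constant no matter how big the log is.  Here's a minimal sketch using strpos() for a plain substring match (the function name filter_log and the variable names are just placeholders for this example; swap in preg_match() if you need a real regex):

```php
<?php
// Minimal sketch: copy every line of $in_path that contains $needle
// into $out_path.  Reads one line at a time with fgets(), so even a
// gigabyte log only costs one line's worth of memory.
function filter_log($in_path, $out_path, $needle)
{
    if (!($in = fopen($in_path, "r"))) {
        die("couldn't open $in_path for reading!");
    }
    if (!($out = fopen($out_path, "a"))) {
        die("couldn't open $out_path for appending!");
    }
    while (!feof($in)) {
        $line = fgets($in);
        // strpos() returns 0 for a match at the start of the line,
        // so compare with !== false, not just "if (strpos(...))"
        if ($line !== false && strpos($line, $needle) !== false) {
            fwrite($out, $line);   // push the matching line to the new file
        }
    }
    fclose($in);
    fclose($out);
}
?>
```

Since you have about 20 logs, you could just call that in a foreach over your list of file names, appending them all into the same output file.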

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

