Re: Very Large text file parsing

Paul Scott wrote:
I have a very large text file that gets dumped into a directory every
now and then. It is typically at least 750MB, and my question is:

What is the best method to parse this thing and insert the data into a
postgres db?

I have tried using file(), fget*() and some others, all with limited
success. It goes through OK (I am sending it to a background process on
the server and using a callback function to email me when done), but it
is really hammering the machine, besides taking a really long time to
finish.

Is there a better way of approaching this? Any help would be greatly
appreciated.

First, what is your approach? I suspect that you are doing this with a cron job through php-cli.

Now, to avoid using too many resources, try fopen() and fgets(). Also use persistent connections, so you don't pay that connection overhead every time.

The problem with file() is that it loads the whole file into memory, and you don't want 700+MB in memory.
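For illustration only, here is a minimal sketch of that approach. The file path, connection string, table name and tab-delimited layout are all assumptions; adapt them to the real dump format:

<?php
// Persistent connection, as suggested above, via pg_pconnect().
$conn = pg_pconnect('host=localhost dbname=mydb user=me password=secret');
if (!$conn) {
    die("Could not connect to PostgreSQL\n");
}

// Hypothetical path to the dumped file.
$fh = fopen('/path/to/dump.txt', 'r');
if (!$fh) {
    die("Could not open the dump file\n");
}

// fgets() reads one line at a time, so memory use stays flat no matter
// how big the file is (unlike file(), which slurps the whole thing).
while (($line = fgets($fh)) !== false) {
    $line = rtrim($line, "\r\n");
    if ($line === '') {
        continue;
    }

    // Assumed format: tab-separated fields; change to suit the dump.
    $fields = explode("\t", $line);

    // pg_query_params() handles escaping and keeps the SQL simple.
    pg_query_params($conn,
        'INSERT INTO mytable (col1, col2) VALUES ($1, $2)',
        array($fields[0], $fields[1]));
}

fclose($fh);
?>

If the data allows it, wrapping batches of inserts in a transaction (or using PostgreSQL's COPY) will speed things up considerably, but the loop above is the core of the fopen()/fgets() idea.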

--
 21:50:04 up 2 days,  9:07,  0 users,  load average: 0.92, 0.37, 0.18
---------------------------------------------------------
Lic. Martín Marqués         |   SELECT 'mmarques' ||
Centro de Telemática        |       '@' || 'unl.edu.ar';
Universidad Nacional        |   DBA, Programador,
    del Litoral             |   Administrador
---------------------------------------------------------

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

