RE: Very Large text file parsing

> Paul Scott wrote:
> > I have a very large text file that gets dumped into a directory every
> > now and then. It is typically at least 750MB, and my question is:
> >
> > What is the best method to parse this thing and insert the data into a
> > postgres db?
> >
> > I have tried using file(), fget*() and some others, all with limited
> > success. It goes through OK (I am sending it to a background process on
> > the server and using a callback function to email me when done) but it
> > is really knocking the machine hard, besides taking a really long time
> > to finish.
> >
> > Is there a better way of approaching this? Any help would be greatly
> > appreciated.
>
> First, what is your approach? I suspect you are doing this with a cron
> job through php-cli.
>
> Now, to avoid using too many resources, try fopen() and fgets(). Also
> use persistent connections, so you don't have the connection overhead
> each time.
>
> The problem with file() is that it loads the whole file into memory, and
> you don't want 700+MB in memory.
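
To illustrate the fgets() suggestion above, something along these lines
should keep memory use flat -- it reads one line at a time and reuses a
persistent connection. It is only a rough sketch: the file path, the
tab-separated layout and the table/column names are made up, so adjust
them to the real dump format.

<?php
// Rough sketch -- file path, field layout and table/column names are
// invented; adjust them to the real dump.
$fh = fopen('/path/to/dump.txt', 'r');
if (!$fh) {
    die("could not open dump file\n");
}

// Persistent connection, as suggested above.
$db = pg_pconnect('host=localhost dbname=mydb user=me password=secret');

pg_query($db, 'BEGIN');
$count = 0;

while (!feof($fh)) {
    $line = fgets($fh);        // one line at a time keeps memory flat
    if ($line === false) {
        break;
    }
    $fields = explode("\t", rtrim($line, "\r\n")); // assuming tab-separated

    pg_query_params($db,
        'INSERT INTO big_table (col_a, col_b) VALUES ($1, $2)',
        array($fields[0], $fields[1]));

    // Commit in batches so a 750MB import is not one huge transaction.
    if (++$count % 10000 == 0) {
        pg_query($db, 'COMMIT');
        pg_query($db, 'BEGIN');
    }
}

pg_query($db, 'COMMIT');
fclose($fh);
?>

If the dump maps straight onto one table, PostgreSQL's COPY (pg_copy_from()
on the PHP side) is usually much faster than row-by-row INSERTs.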

In addition to Martin's good suggestions (and also assuming you're running
php-cli via cron), you could use nice to stop it consuming too many
resources:

http://en.wikipedia.org/wiki/Nice_%28Unix%29
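
For example, a crontab entry along these lines (the php binary path and
script name are just placeholders) would run the import at the lowest CPU
priority:

0 3 * * * nice -n 19 /usr/bin/php /path/to/import.php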

Edward

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

