Re: large csv text file import into postgresql via PHP

Oops, forgot to mention the reason for cron: timeouts.

Processing 143MB can take a long time, far longer than a PHP script is normally allowed to run under the web server.
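
For example, a crontab entry along these lines runs the import through the PHP CLI, where the web server's execution-time limit doesn't apply (the schedule, script path, and log path are just examples):

# hypothetical entry: run the importer nightly at 2am, logging all output
0 2 * * * /usr/bin/php /path/to/import_csv.php >> /var/log/csv_import.log 2>&1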

You also need to use fgets() to read the file one line at a time instead of loading the whole thing into memory at once.
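
A minimal sketch of the streaming approach (connection string, table, and column names are invented; fgetcsv() reads one line per call like fgets(), but also handles the comma splitting and quoted fields for you):

<?php
// Stream the CSV one line at a time instead of slurping all 143MB.
// The connection string, table, and column names are placeholders.
$db = pg_connect('dbname=mydb user=myuser')
    or die("could not connect\n");

$fh = fopen('/path/to/dump.csv', 'r')
    or die("could not open file\n");

// fgetcsv() reads a single line per call, so memory use stays flat
// no matter how big the file is; 4096 is the max expected line length.
while (($row = fgetcsv($fh, 4096)) !== false) {
    $name  = pg_escape_string($row[0]);
    $value = pg_escape_string($row[1]);
    pg_query($db, "INSERT INTO mytable (name, value)
                   VALUES ('$name', '$value')");
}

fclose($fh);
pg_close($db);
?>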

pete

Dave wrote:

We have a number of CSV dumps that occasionally have to be used to update tables
in a Postgres database.

Normally this is done by uploading the file, and then whoever is handling it runs a
PHP-based parser that reads the file into an array, splits each line on the comma,
and executes an insert or update against the database with the split values.
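
In rough outline the parser does something like this (table and column names changed):

<?php
// Rough outline of the current approach: file() reads the whole CSV
// into an array in one go, so the entire file sits in memory.
// Table and column names here are placeholders.
$lines = file('/path/to/upload.csv');
$db = pg_connect('dbname=mydb user=myuser');

foreach ($lines as $line) {
    // split on the comma, stripping the trailing newline first
    $fields = explode(',', rtrim($line));
    pg_query($db, "INSERT INTO mytable (name, value)
                   VALUES ('" . pg_escape_string($fields[0]) . "',
                           '" . pg_escape_string($fields[1]) . "')");
}
pg_close($db);
?>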

We have a new upload that involves a 143MB file, compared to previous uploads of
10KB to 150KB. Is there a more efficient way to handle this than having PHP load
the entire file into an array (which would have to be held in memory for the whole
operation, correct)?

thanks

Dave


