Re: large csv text file import into postgresql via PHP

On Thu 25 Sep 2003 13:08, Dave [Hawk-Systems] wrote:
> we have a number of CSV dumps that occasionally have to be used to update
> tables in a postgres database...
>
> normally this is done by uploading the file, and whoever is handling it
> runs a PHP-based parser which opens the file into an array, splits each
> line on the comma, then executes an insert or update to the database with
> the split array values.
>
> we have a new upload that involves a 143MB file, compared to previous
> upload sizes of 10KB to 150KB.  Is there a more efficient way to handle
> this rather than having PHP load the entire file into an array (which
> would have to be in memory during the whole operation, correct?)
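
The setup described above presumably looks something like the rough sketch
below (connection string, table, and column names are made-up; pg_query_params()
is a newer convenience and could be replaced with pg_escape_string() plus
pg_query()). The point is that file() pulls the whole 143MB dump into a PHP
array at once, so the entire file sits in memory for the duration of the run:

<?php
// Sketch of the parser described above; names and paths are assumptions.
$db    = pg_connect('dbname=mydb');
$lines = file('/tmp/upload.csv');   // whole file loaded into memory here

foreach ($lines as $line) {
    $fields = explode(',', rtrim($line, "\r\n"));
    pg_query_params($db,
        'INSERT INTO mytable (col1, col2, col3) VALUES ($1, $2, $3)',
        $fields);
}
?>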

I guess you are using file() to read the entire file into memory. Try using
fopen() and reading line by line with fgets() (or fgetcsv(), which also does
the comma splitting for you).
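
For example, something along these lines (again a sketch, not a drop-in
script; the connection string, table, and columns are made up, and
pg_query_params() assumes a reasonably recent PHP/PostgreSQL setup):

<?php
// Streaming version: fgetcsv() reads and parses one CSV line at a time,
// so memory use stays flat no matter how large the file is.
$db = pg_connect('dbname=mydb');
$fh = fopen('/tmp/upload.csv', 'r');

while (($fields = fgetcsv($fh, 4096)) !== false) {
    pg_query_params($db,
        'INSERT INTO mytable (col1, col2, col3) VALUES ($1, $2, $3)',
        $fields);
}

fclose($fh);
?>

For a file that size it is also worth wrapping the loop in a single
transaction (BEGIN/COMMIT), or bypassing row-by-row inserts entirely with
PostgreSQL's COPY command, which is much faster for bulk loads.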

-- 
Why use just any relational database,
when you can use PostgreSQL?
-----------------------------------------------------------------
Martín Marqués                  |        mmarques@unl.edu.ar
Programmer, Administrator, DBA  |        Centro de Telematica
                                |        Universidad Nacional del Litoral
-----------------------------------------------------------------
