Re: Very Large text file parsing

Paul Scott wrote:
> On Thu, 2007-09-20 at 09:54 -0300, Martin Marques wrote:
>> If not, you should just use the COPY command of PostgreSQL (you are
>> using PostgreSQL if I remember correctly) or simply do a bash script
>> using psql and the \copy command.
>
> Unfortunately, this has to work on all supported RDBMSs, so using
> Postgres- or MySQL-specific functions is not really an option. What I am
> trying, though, is to add a function to our database abstraction layer to
> do batch inserts as per Rob's suggestion, which may help things a bit.

Both of these support importing CSV files (use \copy, as Martin mentioned, for Postgres). MySQL has this: http://dev.mysql.com/doc/refman/4.1/en/load-data.html
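For reference, the bulk-load commands look roughly like this (the table and file names here are just placeholders):

In psql:

\copy import_data FROM 'huge_file.csv' WITH CSV

In MySQL:

LOAD DATA LOCAL INFILE 'huge_file.csv' INTO TABLE import_data
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';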

If you're supporting more than those two, I can't really say whether the others support this type of feature :)

Try batches:

begin;
... 5000 rows
commit;

and rinse/repeat. I know Postgres will be a lot happier with that, because otherwise it's doing a transaction per insert.
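To make that concrete, here's a rough sketch of a batched-insert loop using PDO (the DSN, the table "import_data" and its two columns are made up for illustration; in your case it would go through the abstraction layer instead):

<?php
// Sketch only: commit every 5000 inserts instead of one transaction per row.
$pdo  = new PDO('pgsql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO import_data (col1, col2) VALUES (?, ?)');

$batchSize = 5000;
$count = 0;
$fh = fopen('huge_file.csv', 'r');

$pdo->beginTransaction();
while (($row = fgetcsv($fh)) !== false) {
    $stmt->execute(array($row[0], $row[1]));
    if (++$count % $batchSize === 0) {
        $pdo->commit();           // flush this batch
        $pdo->beginTransaction(); // start the next one
    }
}
$pdo->commit(); // final partial batch
fclose($fh);
?>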


If you take the insert out of the equation (i.e. just run through the file and parse it), is it fast? That'll tell you at least where the bottleneck is.
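A quick way to check is to time the parse loop with the insert left out, something like this (the file name is a placeholder):

<?php
// Sketch: measure parse speed only, no database work.
$start = microtime(true);
$rows  = 0;
$fh = fopen('huge_file.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    $rows++;
}
fclose($fh);
printf("Parsed %d rows in %.2f seconds\n", $rows, microtime(true) - $start);
?>

If that's already slow, the bottleneck is the parsing; if it flies through, it's the inserts.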

(Personally I'd use Perl over PHP for processing files that large, but that may not be an option.)

--
Postgresql & php tutorials
http://www.designmagick.com/


