Re: multiple data to insert to database

Richard Lynch wrote:
On Fri, August 10, 2007 8:36 am, Alain Roger wrote:

I would like to know the best and fastest way to insert around 25,000
records (extracted from a CSV file) into PostgreSQL.
Should I use the standard pg_exec($dbconn, "insert into..."); for each
record?


25,000 records is not THAT large, so try it and see whether the
performance is acceptable.
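
For what it's worth, a minimal sketch of that row-at-a-time approach might look like the following, with all the inserts wrapped in one transaction so each row doesn't pay for its own commit. The connection string, file name, table, and column names are just placeholders:

<?php
// Row-at-a-time version: one INSERT per CSV line, all inside one transaction.
// Connection string, file name, table, and column names are placeholders.
$dbconn = pg_connect('dbname=mydb user=myuser');

pg_query($dbconn, 'BEGIN');

$fh = fopen('data.csv', 'r');
while (($fields = fgetcsv($fh)) !== false) {
    // pg_query_params() takes care of quoting/escaping the values
    pg_query_params($dbconn,
        'INSERT INTO your_table (col1, col2, col3) VALUES ($1, $2, $3)',
        $fields);
}
fclose($fh);

pg_query($dbconn, 'COMMIT');
?>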

If not, ask the PostgreSQL list/forum how to bulk load a CSV file, as
it's not a PHP question, it's a PostgreSQL question.


This question just came up in the pgsql-general list last week, actually.

You'll want to use the COPY command:

COPY your_table FROM STDIN;
http://www.postgresql.org/docs/techdocs.15
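
If the PostgreSQL server itself can read the file (the path is resolved on the server side, and COPY understands CSV directly in 8.0 and later), a single statement along these lines does the whole load; the table and file names here are placeholders:

COPY your_table FROM '/path/to/data.csv' WITH CSV;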

But, as you're using PHP, you should look into the pg_copy_from() function:

http://www.php.net/manual/en/function.pg-copy-from.php

In a nutshell, pass it an array in which each element is a single line from your CSV file. Note that you can specify the field delimiter, which defaults to a tab.

Take special note of how NULLs are represented. If there's any chance that your CSV file has empty fields, you might want to replace them with '\N' (the default NULL string the function expects).
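
Putting that together, a rough sketch might look like the following (the connection string, file name, and table name are assumptions, and it ignores escaping of tabs or backslashes inside field values, which a clean CSV export usually won't have):

<?php
// Build the array of tab-delimited lines that pg_copy_from() expects,
// turning empty CSV fields into \N so COPY treats them as NULL.
// Connection string, file name, and table name are placeholders.
$dbconn = pg_connect('dbname=mydb user=myuser');

$rows = array();
$fh = fopen('data.csv', 'r');
while (($fields = fgetcsv($fh)) !== false) {
    foreach ($fields as $i => $value) {
        if ($value === '') {
            $fields[$i] = '\N';    // default NULL representation for COPY
        }
    }
    $rows[] = implode("\t", $fields) . "\n";  // one line per array element
}
fclose($fh);

if (!pg_copy_from($dbconn, 'your_table', $rows)) {
    echo 'Bulk load failed: ' . pg_last_error($dbconn);
}
?>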

brian


