> can do this fine with small files... but if I get above 1,000 rows it takes so long that it times out.
PHP is slow, but not *that* slow; you have a problem somewhere!
I can upload the equivalent of a 10,000-row file using COPY from psql in 2 seconds,
so the time is going into the PHP processing (really, all it does is send the data to
pg_put_line()).
If you read the whole file into memory, the server will kill your script when it hits the memory limit (I think the default is 8 megabytes or so)...
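(If you want to see what you're working with, something like this shows the configured limit and current usage, assuming your PHP build exposes memory_get_usage():)

<?php
// Purely illustrative: report the configured memory limit and current usage.
echo "memory_limit: " . ini_get('memory_limit') . "\n";
echo "in use now:   " . memory_get_usage() . " bytes\n";
?>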
So I'd advise reading the file line by line with fgets(), skipping the first line, and pg_put_line()-ing the rest. That way you only hold one line in memory at a time. You can also echo (and flush) messages like 'XX lines inserted...' to the user while it crunches. For example:
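A minimal sketch of that loop, assuming a tab-delimited upload at /tmp/upload.csv with one header line and a target table called mytable (those names, and the connection string, are just placeholders):

<?php
// Sketch only: stream the uploaded file into COPY one line at a time.
set_time_limit(0);                        // don't let the script time out

$conn = pg_connect("dbname=mydb");        // placeholder connection string
$fp   = fopen("/tmp/upload.csv", "r");

pg_query($conn, "COPY mytable FROM STDIN");

fgets($fp);                               // throw away the header line
$count = 0;
while (($line = fgets($fp)) !== false) {
    pg_put_line($conn, $line);            // fgets() keeps the trailing "\n"
    if (++$count % 1000 == 0) {
        echo "$count lines inserted...\n";
        flush();                          // let the user see progress
    }
}

pg_put_line($conn, "\\.\n");              // COPY end-of-data marker
pg_end_copy($conn);
fclose($fp);
?>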
If you're really stuck, and have command-execution privileges, why not system() a command line like "awk -blah yourfile | psql copy to your table", or even launch it as a background process?
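A rough sketch of that fallback, reusing the same placeholder file and table names as above; the subshell feeds the COPY command, the data (minus the header line), and the end-of-data marker to psql on stdin, and the trailing '&' pushes the whole pipeline into the background:

<?php
// Sketch only: hand the whole job to awk + psql and run it in the background.
$file = escapeshellarg("/tmp/upload.csv");   // guard the user-supplied path
$cmd  = "( echo 'COPY mytable FROM STDIN;'; awk 'NR > 1' $file; echo '\\.' )"
      . " | psql mydb > /dev/null 2>&1 &";
system($cmd);
?>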