Thank you for all the input.
Actually, I am reluctant to do the update line by line.
I plan to use a shell script to:
. replace characters such as ' with \'
. rewrite each line as an INSERT statement
. call psql -c "query" to load the file into the db
Java will then call this shell script; once the data is populated into
the tables, it will do further data comparison based on that table.
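For what it's worth, those three steps could be sketched as a shell pipeline like the one below. The table name, demo file, and database name are placeholders, and psql -f is used instead of -c for the load; it also doubles single quotes ('' is the standard SQL form, rather than \') and assumes no commas embedded inside fields.

```shell
#!/bin/sh
# Sketch only: "people", sample.csv, and mydb are illustrative names,
# and the CSV is assumed to contain no embedded commas.
TABLE="people"
CSV="sample.csv"
printf "1,O'Brien\n2,plain\n" > "$CSV"   # tiny demo input

# Step 1: escape single quotes ('' is the standard SQL escape).
# Step 2: rewrite each CSV line as an INSERT statement.
sed "s/'/''/g" "$CSV" |
awk -F',' -v t="$TABLE" -v q="'" '{
    line = "INSERT INTO " t " VALUES (";
    for (i = 1; i <= NF; i++)
        line = line q $i q (i < NF ? ", " : "");
    print line ");";
}' > inserts.sql

cat inserts.sql
# Step 3: load the generated file (needs a real database), e.g.:
#   psql -f inserts.sql mydb
```

A more robust version would also handle quoted fields containing commas, which plain awk -F',' does not.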
You could write a script that would transform a .csv file into an INSERT
statement and save it to an .sql file.
Or I suppose you could do silly ODBC stuff with MS Access.
--
Brandon Aiken
CS/IT Systems Engineer
-----Original Message-----
From: pgsql-general-owner@xxxxxxxxxxxxxx
[mailto:pgsql-general-owner@xxxxxxxxxxxxxx] On Behalf Of Emi Lu
Sent: Tuesday, September 19, 2006 2:15 PM
To: PgSQL General
Subject: [GENERAL] Load a csv file into a pgsql table
Greetings,
*Besides* the COPY command, are there other quick ways to load data from
a csv file into a pgsql table, please?
Haven't seen the OP go by, but here's one of the simplest csv
loaders ever created. No guarantees of suitability, implied or
otherwise.
#!/usr/bin/php -q
<?php
// Minimal CSV-to-COPY converter: prints a COPY command followed by
// tab-separated rows, ready to pipe into psql.
if ($argc != 3) {
    echo "Usage:\n\n  loadpg tablename filename\n";
    exit(1);
}
$tablename = $argv[1];
$filename  = $argv[2];
if (!file_exists($filename)) {
    die("given filename doesn't exist\n");
}
print "copy $tablename from stdin;\n";
$fp = fopen($filename, "r");
// fgetcsv() returns an array per row and false at EOF, so test its
// return value directly rather than calling strlen() on an array
while (($line = fgetcsv($fp, 4096)) !== false) {
    if ($line === array(null)) continue;  // skip blank lines
    // Escape the characters COPY's text format treats specially
    foreach ($line as $i => $field) {
        $line[$i] = str_replace(array("\\", "\t", "\n"),
                                array("\\\\", "\\t", "\\n"),
                                (string)$field);
    }
    print implode("\t", $line) . "\n";
}
fclose($fp);
print "\\.\n";
?>
Note that you just redirect the output to psql and off you go, e.g.:
  loadpg mytable data.csv | psql mydb