On 30/07/10 16:57, Scott Frankel wrote:
> On Jul 30, 2010, at 1:13 AM, Richard Huxton wrote:
>> On 30/07/10 07:52, Scott Frankel wrote:
>>> I have a number of very long strings that each contain many instances of
>>> semi-colons, single quotes, forward and back slashes, etc. I'm looking
>>> for an efficient and safe way to write them to my db using a prepared
>>> statement.
>> What language? From "C"?
> Importing an SQL script. eg: \i my_script_of_prepared_statements.sql
> The full statement (below) illustrates the problem I'm encountering.
> INSERT INTO foo (name, body) VALUES ('foo', 'this will fail 'fer sher;'
> on the characters inside the string');
Ah - the solution is: don't do that.
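For a handful of statements you could just escape by hand: double each
embedded single quote, or wrap the value in dollar-quotes. Either of
these is valid:

INSERT INTO foo (name, body)
VALUES ('foo', 'this will fail ''fer sher;'' on the characters inside the string');

-- or, avoiding quote-doubling entirely:
INSERT INTO foo (name, body)
VALUES ('foo', $$this will fail 'fer sher;' on the characters inside the string$$);

But with a whole script full of long strings, hand-escaping every
literal gets unwieldy.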
You're going to have to pre-process the strings in some way, or there
will always be the chance of problems. Probably the best way to handle a
bulk insert is through the COPY command:
BEGIN;
COPY foo (name, body) FROM stdin;
n1 b1
n2 b2
foo this will fail 'fer sher;' on the characters inside the string
\.
COMMIT;
By default COPY expects one line per row, with columns separated by tab
characters. You can also have '/path/to/file/name' instead of stdin, but
the file will need to be accessible from the backend process. If that's
not the case (and it probably isn't) then you want to use psql's "\copy"
variant which views the world from the client end of things.
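For example, from psql (the path here is just a placeholder):

\copy foo (name, body) from '/path/to/data.tsv'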
COPY is faster than separate inserts, and the only characters you need to
worry about are backslash, tab, carriage-return and newline. These would
be replaced by the sequences "\\", "\t", "\r" and "\n".
I don't know what format your strings are in initially, but a bit of
perl/python/ruby can easily tidy them up.
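For instance, a rough Python sketch (the "records" list and the output
filename are just stand-ins for wherever your strings actually live):

# Hypothetical stand-in for your real data source.
records = [
    ('foo', "this will fail 'fer sher;' on the characters inside the string"),
]

def copy_escape(s):
    # Escape backslashes first, then tab / carriage-return / newline.
    return (s.replace('\\', '\\\\')
             .replace('\t', '\\t')
             .replace('\r', '\\r')
             .replace('\n', '\\n'))

with open('data.tsv', 'w') as out:
    for name, body in records:
        # One row per line, columns separated by a literal tab.
        out.write(copy_escape(name) + '\t' + copy_escape(body) + '\n')

The resulting file then loads with the \copy command above.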
Finally, more recent versions of PG have a COPY that supports CSV
formatting too. See the manuals for more details on this.
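Something like this should do it (server-side path again, so the same
accessibility caveat applies):

COPY foo (name, body) FROM '/path/to/data.csv' WITH CSV;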
--
Richard Huxton
Archonet Ltd