On Mon, Sep 28, 2009 at 10:38:05AM -0500, Dave Huber wrote:
> Hi, I'm fairly new to postgres and am having trouble finding what I'm
> looking for. Is there a feature that allows bulk inserts into tables?
> My setup is Win XPe 2002 SP3 and PostgreSQL 8.3. I need to add
> entries from a file where each file contains 250 - 500 records. The
> files are created by a third party and read/written as binary. The
> table looks like the following:

<snip>

> The current method for transferring records from the file to postgres
> is a prepared statement that is called iteratively on each record
> read from the file:
>
> INSERT INTO data_log_20msec_table (timestamp_dbl, data)
>     VALUES ($1::double precision, $2::bytea)
>
> Using COPY is out of the question, as the file is not formatted for
> that, and since other operations need to occur, the file needs to be
> read sequentially anyway.

The usual approach is to use COPY FROM STDIN and then send the rows
with PQputCopyData. That way you can perform any necessary munging and
don't require the file to be on disk at all.

Have a nice day,
-- 
Martijn van Oosterhout <kleptog@xxxxxxxxx>   http://svana.org/kleptog/
> Please line up in a tree and maintain the heap invariant while
> boarding. Thank you for flying nlogn airlines.
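
For readers hitting the same problem, below is a minimal libpq sketch of
the COPY FROM STDIN approach described above, not a drop-in solution. It
assumes an already-open PGconn *conn and the data_log_20msec_table from
the original post; the loop that reads and munges records from the
binary file is reduced to a single hard-coded placeholder row.

#include <stdio.h>
#include <string.h>
#include <libpq-fe.h>

/* Stream rows into the table with COPY instead of per-row INSERTs.
 * Sketch only: conn is an already-open connection, and the per-record
 * read loop is represented by one hard-coded row. */
static int copy_records(PGconn *conn)
{
    PGresult *res;
    char line[256];

    res = PQexec(conn,
        "COPY data_log_20msec_table (timestamp_dbl, data) FROM STDIN");
    if (PQresultStatus(res) != PGRES_COPY_IN)
    {
        fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        return -1;
    }
    PQclear(res);

    /* For each record read (and munged) from the file, emit one
     * tab-separated, newline-terminated line.  Printable bytes can go
     * into the bytea column as-is; arbitrary binary data must be
     * escaped for both the COPY text format and bytea input
     * (e.g. a zero byte is sent as \\000). */
    snprintf(line, sizeof(line), "%.17g\thello\n", 1254152285.02);
    if (PQputCopyData(conn, line, (int) strlen(line)) != 1)
    {
        fprintf(stderr, "PQputCopyData failed: %s", PQerrorMessage(conn));
        return -1;
    }

    /* Finish the COPY and pick up the server's final result. */
    if (PQputCopyEnd(conn, NULL) != 1)
    {
        fprintf(stderr, "PQputCopyEnd failed: %s", PQerrorMessage(conn));
        return -1;
    }
    while ((res = PQgetResult(conn)) != NULL)
    {
        if (PQresultStatus(res) != PGRES_COMMAND_OK)
            fprintf(stderr, "COPY error: %s", PQerrorMessage(conn));
        PQclear(res);
    }
    return 0;
}

Sending all 250 - 500 rows of a file inside one COPY (or at least one
transaction) avoids a round trip and a separate commit per row, which
is typically where the time goes with iterated INSERTs.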