Thanks for all the responses. I have one more thought: since my input data is split into about 200 files (3 GB each), I could potentially spawn one load command for each file. What would be the maximum number of input connections Postgres can handle without bogging down? When I say 'input connection' I mean "psql -U postgres -d dbname -f one_of_many_sql_files".

Thanks,
Ben

On 07/12/2009 12:59 PM, Greg Smith wrote:

Ben Brehmer wrote:
By "Loading data" I am implying: "psql -U postgres -d somedatabase -f sql_file.sql". The sql_file.sql contains table creates and insert statements. There are no indexes present, nor are any created during the load.

Your basic options here are to batch the INSERTs into bigger chunks, and/or to split your data file up so that it can be loaded by more than one process at a time. There are some comments and links to more guidance here at http://wiki.postgresql.org/wiki/Bulk_Loading_and_Restores
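As a rough sketch of how those per-file loads could be spawned without opening 200 connections at once (assuming GNU xargs is available, and using a hypothetical data_*.sql glob in place of the real file names):

    # Run the loads in parallel, but cap concurrency at 8 psql sessions.
    # Adjust -P to match the number of CPU cores / disk bandwidth available.
    ls data_*.sql | xargs -P 8 -I{} psql -U postgres -d dbname -f {}

The -P flag bounds the number of concurrent psql processes, so the connection count stays fixed no matter how many input files there are; the next file starts as soon as a slot frees up.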