Rick Casey <caseyrick@xxxxxxxxx> writes:

> So, I am wondering if there is any way to optimize this process? I have
> been using Postgres for several years, but have never had to partition
> or optimize it for files of this size until now.
> Any comments or suggestions would be most welcomed from this excellent
> forum.

The pgloader tool will import your data as batches of N lines; you get to
say how many lines you want to consider in each transaction. Plus, you can
have more than one python thread importing your big file, either sharing
one writer, with the other threads doing the parsing and COPY, or having
N independent threads each doing the reading/parsing/COPY.

  http://pgloader.projects.postgresql.org/

Hope this helps,
-- 
dim
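
If it helps to see what "batches of N lines" means in practice, here is a
rough sketch of the idea in plain psycopg2 -- not pgloader itself, and the
table name, file layout and batch size are made up for the example:

  # Minimal sketch of batched COPY: commit every BATCH_SIZE lines so a
  # bad row only costs you one batch, not the whole import.
  # Assumes psycopg2, a table named "events", tab-separated input.
  import io
  import psycopg2

  BATCH_SIZE = 10000   # lines per transaction; tune for your data

  def flush(cur, batch):
      # COPY the buffered lines into the target table
      cur.copy_from(io.StringIO("".join(batch)), "events")

  def load_in_batches(path, dsn="dbname=test"):
      conn = psycopg2.connect(dsn)
      cur = conn.cursor()
      batch = []
      with open(path) as f:
          for line in f:
              batch.append(line)
              if len(batch) >= BATCH_SIZE:
                  flush(cur, batch)
                  conn.commit()    # one transaction per batch
                  batch = []
      if batch:
          flush(cur, batch)
          conn.commit()
      cur.close()
      conn.close()

The parallel variants pgloader offers amount to running several of these
loops at once, each with its own connection, either over pre-split chunks
of the file or fed by a single reader.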