On Tue, May 10, 2016 at 03:00:55PM +0200, Nicolas Paris wrote:
> > The way I want is:
> > csv -> binary -> postgresql
> >
> > Is this just to be quicker, or are you going to add some business logic
> > while converting the CSV data? As you mentioned ETL, I assume the second,
> > as I don't think that converting CSV to binary and then loading it into
> > PostgreSQL will be more convenient than loading directly from CSV...
> > However quick the conversion is, you still have to load the data from
> > CSV anyway.
>
> Right, an ETL process means heavy business logic:
> get the data (CSV or other) -> transform it -> produce a binary ->
> COPY FROM binary from stdin
>
> Producing 100 GB of CSVs is a waste of time.

Ah. You need to fiddle with the data. Then you need to weigh the pros of
something agnostic to Postgres's internals against something that needs to
be aware of them. You will need to delve into the source code for data
types more complex than INTEGER, TEXT and BYTEA (which made up the
majority of my data when I looked into it).

--
"A search of his car uncovered pornography, a homemade sex aid, women's
stockings and a Jack Russell terrier."
- http://www.dailytelegraph.com.au/news/wacky/indeed/story-e6frev20-1111118083480

--
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
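
[Editor's illustration] As a rough sketch of the "produce a binary -> COPY FROM
binary from stdin" step discussed above: PostgreSQL's binary COPY format is an
11-byte signature, a 4-byte flags word, a 4-byte header-extension length, then
one tuple per row (an int16 field count, and for each field an int32 byte
length followed by the raw bytes, with -1 marking NULL), terminated by an int16
of -1. The table and column names below are hypothetical; only INTEGER and TEXT
columns are shown, since anything fancier does require digging into the data
types' send/recv source as noted above.

```python
import struct

def binary_copy_stream(rows):
    """Build a PostgreSQL binary COPY stream for rows of (int4, text)."""
    out = bytearray()
    out += b"PGCOPY\n\xff\r\n\x00"               # fixed 11-byte signature
    out += struct.pack("!ii", 0, 0)              # flags, header-extension length
    for num, label in rows:
        out += struct.pack("!h", 2)              # two fields in this tuple
        if num is None:
            out += struct.pack("!i", -1)         # -1 length means NULL
        else:
            out += struct.pack("!ii", 4, num)    # int4: length 4, big-endian value
        if label is None:
            out += struct.pack("!i", -1)
        else:
            data = label.encode("utf-8")         # text: raw UTF-8 bytes
            out += struct.pack("!i", len(data)) + data
    out += struct.pack("!h", -1)                 # file trailer
    return bytes(out)

stream = binary_copy_stream([(1, "alpha"), (2, None)])
# Feed `stream` to:  COPY mytable (id, label) FROM STDIN (FORMAT binary)
```

The transform step of the ETL would run inside the loop, emitting fields
straight into the byte stream instead of materializing an intermediate CSV.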