
Re: Load data from a csv file without using COPY


 



Just tell me the site, I don't have time to waste on shitty things. I will program a spammer to send email to this list.

Bye


From: James Keener <jim@xxxxxxxxxxxxx>
Sent: Wednesday, June 20, 2018 3:16 AM
To: Asif Ali; pgsql-general@xxxxxxxxxxxxxxxxxxxx; Alban Hertroys; Ravi Krishna
Subject: Re: Load data from a csv file without using COPY
 
It's people like you who make spam filters worse for the rest of us, to the point that they need to be checked daily for false positives. I'm sure you could have found it in less time than it took to spam the list with obscenities.

On June 19, 2018 6:13:49 PM EDT, Asif Ali <asif2k@xxxxxxxxxxx> wrote:
Please just tell me the site and I will do it right away. I have marked it as junk so many times; I will keep spamming it until my email address is removed from the list.

Bye


From: James Keener <jim@xxxxxxxxxxxxx>
Sent: Wednesday, June 20, 2018 3:11 AM
To: pgsql-general@xxxxxxxxxxxxxxxxxxxx; Asif Ali; Alban Hertroys; Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY
 
Seriously, stop spamming the list and stop cursing and acting like a petulant child. Go to the site and unsubscribe or use a mail client that understands the standard list headers.

On June 19, 2018 6:06:59 PM EDT, Asif Ali <asif2k@xxxxxxxxxxx> wrote:
How the fuck do I unsubscribe from this mailing list? I get more than 100 emails a day.

Bye


From: Alban Hertroys <haramrae@xxxxxxxxx>
Sent: Wednesday, June 20, 2018 2:10 AM
To: Ravi Krishna
Cc: PG mailing List
Subject: Re: Load data from a csv file without using COPY
 

> On 19 Jun 2018, at 22:16, Ravi Krishna <srkrishna@xxxxxxxxx> wrote:
>
> In order to test a real-life scenario (and use it for benchmarking) I want to load a large amount of data from CSV files.
> The requirement is that the load should happen like an application writing to the database (that is, no COPY command).
> Is there a tool which can do the job? Basically, parse the CSV file and insert it into the database row by row.
>
> thanks

I think an easy approach would be to COPY the CSV files into a separate database using psql's \copy command, and then dump that as separate INSERT statements with pg_dump --inserts.
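
A minimal sketch of that workflow, assuming a scratch database named staging, a target table named events, and a CSV file at /tmp/events.csv (all of these names are placeholders for illustration):

    # database, table and file names below are made-up placeholders
    createdb staging
    psql -d staging -c 'CREATE TABLE events (id int, payload text);'

    # client-side copy of the CSV into the scratch table
    psql -d staging -c "\copy events FROM '/tmp/events.csv' WITH (FORMAT csv, HEADER)"

    # dump the table back out as one INSERT statement per row
    pg_dump -d staging --table=events --data-only --inserts -f events_inserts.sql

    # replay the row-by-row INSERTs against the database under test
    psql -d benchmark_db -f events_inserts.sql

Replaying events_inserts.sql then issues one ordinary single-row INSERT per CSV line, which matches the "application writing to the database" requirement; it will naturally be much slower than COPY, which is the point of the benchmark.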

Alban Hertroys
--
If you can't see the forest for the trees,
cut the trees and you'll find there is no forest.



--
Sent from my Android device with K-9 Mail. Please excuse my brevity.
