On 4/24/20 9:12 AM, Steve Clark wrote:
On 04/24/2020 11:59 AM, Steve Crawford wrote:
On Fri, Apr 24, 2020 at 8:55 AM Steve Clark <steve.clark@xxxxxxxxxxxxx> wrote:
Hello,
I am using psql to copy data extracted from an InfluxDB instance, in CSV format, into PostgreSQL.
I have a key on the time column, which I have defined as a bigint since the time I get from InfluxDB is an epoch timestamp.
My question is: does psql abort the copy if it hits a duplicate key, or does it keep processing?
The copy will fail at the first duplicate and the whole COPY is rolled back, so no rows are loaded. You could import into a temporary table, preprocess it, and then copy to your permanent table, or use an ETL solution to remove unwanted data before importing. I don't know the nature of your data or project, but perhaps that column isn't suitable for a key.
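
For illustration, a minimal sketch of that staging-table approach (table, column, and file names are hypothetical, not from the original post):

-- Hypothetical permanent table; the bigint key holds the epoch timestamp.
CREATE TABLE metrics (
    time  bigint PRIMARY KEY,
    value double precision
);

-- Staging table with the same columns but no key, so COPY never
-- fails on duplicates (LIKE copies columns/types, not the PK).
CREATE TABLE metrics_staging (LIKE metrics);

-- Load the CSV into staging (psql meta-command; file name is a placeholder).
\copy metrics_staging FROM 'influx_dump.csv' WITH (FORMAT csv, HEADER)

-- Move rows into the keyed table, silently skipping timestamps that
-- already exist (DO NOTHING also absorbs duplicates within the batch).
INSERT INTO metrics
SELECT time, value
FROM metrics_staging
ON CONFLICT (time) DO NOTHING;

TRUNCATE metrics_staging;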
Cheers,
Steve
I am attempting to periodically pull time series data from an InfluxDB instance. The column at issue is the timestamp. I have a script that pulls the last 15 minutes of data from InfluxDB as CSV and pipes it into a psql -c "\copy ..." command. I was looking for the simplest way to do this.
Then, as suggested above, pull the data into a staging table that has no constraints (e.g. no PK). Verify the data and then push it into the permanent table.
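
For example, a rough sketch of the whole periodic step, assuming the hypothetical metrics/metrics_staging tables sketched earlier, a placeholder extraction script, and a made-up database name:

#!/bin/sh
# pull_last_15min.sh is a placeholder for the script that writes the last
# 15 minutes of InfluxDB data to stdout as CSV; "metricsdb" is illustrative.
./pull_last_15min.sh \
  | psql -d metricsdb -c "\copy metrics_staging FROM STDIN WITH (FORMAT csv, HEADER)"

# Verify/clean the staging rows as needed, then push them into the keyed
# table, skipping timestamps that already exist, and clear the staging table.
psql -d metricsdb <<'SQL'
INSERT INTO metrics
SELECT time, value FROM metrics_staging
ON CONFLICT (time) DO NOTHING;
TRUNCATE metrics_staging;
SQL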
--
Stephen Clark
*NetWolves Managed Services, LLC.*
Sr. Applications Architect
Phone: 813-579-3200
Fax: 813-882-0209
Email: steve.clark@xxxxxxxxxxxxx
http://www.netwolves.com
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx