> On 30 Jul 2015, at 2:27, Sherrylyn Branchaw <sbranchaw@xxxxxxxxx> wrote:
>
> Based on your PS asking about data types and commenting that you don't want to put the hour in a separate column, it sounds like this is a brand-new table you're creating. If so, and if this is a one-time COPY operation, you can create a text column for the initial import. Then, after you're done importing, you can execute
>
> ALTER TABLE ts_test ALTER COLUMN ts_fld TYPE TIMESTAMP USING (to_timestamp(ts_fld, 'YYYYMMDDHH24'));
>
> to convert the imported data to a timestamp. Then you're set.
>
> If there will be ongoing imports of more files like this, though, you'll need the intermediate-table solution offered by Adrian.

Or keep both columns and update the rows where the text column is NOT NULL and the timestamp column is NULL.

> I was going to suggest a trigger, but it turns out that the data type checking happens even before the BEFORE trigger fires, so you don't get a chance to massage your data before actually inserting it. I got 'ERROR: date/time field value out of range: "2015072913"' before the trigger even fired. I wonder if that's deliberate?
>
> I was able to implement a workaround by adding a raw_ts_fld column of type text, but an extra column might be too ugly for you relative to a temp table, I don't know.

I was thinking that perhaps an updatable view might do the trick?

You would need to create a view with the timestamp column converted to text in the format used in your CSV file. Next, you add an INSERT rule that converts the text back to a timestamp and inserts the row into the actual table. Finally, you use the view in the COPY statement instead of the table. A rough sketch follows at the end of this message.

Added bonus: you can now also use the view to export your table to the same CSV format.

Alban Hertroys
--
If you can't see the forest for the trees,
cut the trees and you'll find there is no forest.
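
The sketch referred to above, untested: ts_test and ts_fld are taken from the ALTER TABLE in the quoted message, while the view, function, trigger and file names are made up for illustration. One caveat: the COPY documentation notes that COPY FROM does not invoke rules, so this sketch uses an INSTEAD OF INSERT trigger on the view instead of an INSERT rule; copying into a view through such a trigger only works on versions that support it, and plain INSERTs through the view behave the same way.

-- The base table keeps a proper timestamp column.
CREATE TABLE ts_test (
    ts_fld timestamp
);

-- The view presents the timestamp in the CSV file's text format.
CREATE VIEW ts_test_csv AS
    SELECT to_char(ts_fld, 'YYYYMMDDHH24') AS ts_fld
      FROM ts_test;

-- Convert the incoming text back to a timestamp on insert.
CREATE FUNCTION ts_test_csv_insert() RETURNS trigger AS $$
BEGIN
    INSERT INTO ts_test (ts_fld)
    VALUES (to_timestamp(NEW.ts_fld, 'YYYYMMDDHH24')::timestamp);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER ts_test_csv_ins
    INSTEAD OF INSERT ON ts_test_csv
    FOR EACH ROW EXECUTE PROCEDURE ts_test_csv_insert();

-- Import goes through the view rather than the table
-- (or use \copy from psql for a client-side file):
COPY ts_test_csv FROM '/path/to/data.csv' WITH (FORMAT csv);

-- The same view exports the table back to the CSV's format:
COPY (SELECT * FROM ts_test_csv) TO '/path/to/export.csv' WITH (FORMAT csv);

With that in place the import and the export both speak the file's YYYYMMDDHH24 format, while the table itself stores a real timestamp.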