Re: constant crashing

The CSV files are produced by another system, a Windows app on a Windows machine. I copy them to a USB key and then onto the Ubuntu machine. The data is then imported via the COPY command.

COPY master (field01,field02..fieldX) FROM '/data/file.text' DELIMITER E'\t';
The fields are tab-delimited.

Importing the data works fine; I can get all the data into a single table without any problems. The issue only appears when I start to update that single table. That is why I started using smaller temporary tables, one per CSV file, to do the updates in the small tables before I move everything into the single large table.
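Roughly, the per-file workflow looks like this (the staging table name, the listed columns, and the UPDATE shown here are just placeholders; the real column names and update logic differ):

-- one small staging table per CSV file, same layout as the big table
CREATE TEMP TABLE staging (LIKE master INCLUDING DEFAULTS);

-- load one tab-delimited file into the small staging table
COPY staging (field01, field02) FROM '/data/file.text' DELIMITER E'\t';

-- do the clean-up/updates on the small table first (placeholder update)
UPDATE staging SET field02 = trim(field02) WHERE field02 IS NOT NULL;

-- then append the corrected rows to the large table
INSERT INTO master SELECT * FROM staging;

DROP TABLE staging;

That way each update only ever touches one file's worth of rows before anything lands in the big table.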

After all the data is loaded and updated, I run PHP programs on the large table to generate reports. All of that works well EXCEPT for performing the updates on the data. I do not want to use Perl or any outside tool; I want it all done in SQL, because I am required to document all my steps so that someone else can take over, so everything needs to be as simple as possible.

