On 4/14/24 13:18, jack wrote:
> The CSV files are being produced by another system, a Windows app on a
> Windows machine. I then copy them to a USB key and copy them onto the
> Ubuntu machine. The data is then imported via the COPY command.
The app?
The locale in use on the Windows machine?
The locale in use in the database?
> COPY master (field01,field02..fieldX) FROM '/data/file.text' DELIMITER E'\t'
> The fields are tab delimited.
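Worth checking: COPY can be told the encoding of the source file, so the
conversion to the database encoding happens at import time. A minimal
sketch, assuming the Windows app writes the common WIN1252 code page
(that needs to be confirmed on the producing machine):

    COPY master (field01, field02)  -- column list shortened; use the real one
    FROM '/data/file.text'
    WITH (FORMAT text, DELIMITER E'\t', ENCODING 'WIN1252');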
> But importing the data works. I can get all the data into a single table
> without any problems. The issue arises only when I start to update the
> single table. That is why I started using smaller temporary tables for
> each CSV file, to do the updates in the smaller tables before I move
> them all to a single large table.
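That staging pattern is easy to keep in plain SQL. A rough sketch with
illustrative names (staging, field01; the real column list comes from
your schema):

    BEGIN;
    -- One temp table per CSV file, dropped automatically at commit.
    CREATE TEMP TABLE staging (LIKE master INCLUDING DEFAULTS) ON COMMIT DROP;
    COPY staging (field01, field02)
    FROM '/data/file.text'
    WITH (FORMAT text, DELIMITER E'\t');
    -- Per-file cleanup happens here, e.g. stripping stray whitespace.
    UPDATE staging SET field01 = btrim(field01);
    -- Then append the cleaned rows to the big table.
    INSERT INTO master SELECT * FROM staging;
    COMMIT;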
> The import just dumps the data in; my suspicion is that the problem is
> related to using string functions on the data.
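If string functions are where it breaks, the first things to look at are
the encodings involved and which rows contain non-ASCII characters. A
couple of diagnostic queries (the regex probe on field01 is illustrative;
run it against each text column):

    -- What the database thinks it is working with.
    SHOW server_encoding;
    SHOW lc_collate;
    SHOW lc_ctype;

    -- Rows containing characters outside plain ASCII, the usual suspects
    -- when string functions choke on data imported from Windows.
    SELECT field01 FROM master WHERE field01 ~ '[^\x01-\x7f]';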
> After all the data is loaded and updated, I run PHP programs on the
> large table to generate reports. All of which works well EXCEPT for
> performing the updates on the data. And I do not want to use Perl or any
> outside tool. I want it all done in SQL because I am required to document
> all my steps so that someone else can take over, so everything needs to
> be as simple as possible.
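Keeping it all in SQL fits that requirement: the whole load-and-update
sequence can live in one commented script run with psql. A minimal
sketch, with a hypothetical file name:

    -- load_and_update.sql: the documented, repeatable procedure.
    -- Run as: psql -d yourdb -f load_and_update.sql
    \set ON_ERROR_STOP on
    -- \copy reads the file on the client side, so no server file access needed.
    \copy master (field01, field02) FROM '/data/file.text' WITH (FORMAT text, DELIMITER E'\t')
    UPDATE master SET field01 = btrim(field01);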
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx