Hi,
I need to import some log files from an application running on a different
host.
The app can't talk to the db directly; all it does is create a daily dump
in a remote directory that the Linux host running the db server can mount
via Samba.
The import would be easy if the files had a constant name, but the app
creates CSV files with names like "ExportYYYYMMDD".
I could have cron use "find" to search for all the files in the mounted
directory.
But how can I pipe an SQL script into the db server so that it takes the
filenames coming out of "find" as a parameter?
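To make it concrete, here's the kind of cron-side loop I'm picturing;
note that /mnt/applogs, the "logs" database and the import_tmp table are
made-up placeholder names, not my real setup:

    #!/bin/sh
    # Sketch only: feed each file name from "find" into psql.
    # \copy runs COPY on the client side, so psql itself reads the
    # file, and the name can come straight from the shell variable.
    find /mnt/applogs -name 'Export*' -type f | while read -r f; do
        psql -d logs -c "\\copy import_tmp FROM '$f' WITH CSV"
    done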
Because of the somewhat limited intelligence of the application that
creates the logs, I have to read the contents of each log into a temporary
table and insert from there only those lines that aren't already in the
actual log table within the db.
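That part would look roughly like the following psql script; "applog",
"log_tmp", the column names and the file path are all invented for the
example:

    -- Sketch: stage the day's file, then insert only the new lines.
    CREATE TEMP TABLE log_tmp (LIKE applog);
    \copy log_tmp FROM '/tmp/todays_file.csv' WITH CSV
    INSERT INTO applog (logged_at, message)
    SELECT t.logged_at, t.message
      FROM log_tmp t
     WHERE NOT EXISTS (SELECT 1
                         FROM applog a
                        WHERE a.logged_at = t.logged_at
                          AND a.message   = t.message);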
I've got everything covered except the filenames, which change from day
to day. :(
I could copy each of those files into a temp directory and import from
there, so the SQL script wouldn't have to deal with the date in the
file name. But I'd rather store the names in a table, so that the script
could skip all the files that were already imported previously.
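By storing the names I mean something like this (again, "imported_files"
and the "logs" database are placeholder names):

    -- Sketch: bookkeeping table for file names already loaded.
    CREATE TABLE imported_files (
        filename    text PRIMARY KEY,
        imported_at timestamp NOT NULL DEFAULT now()
    );

Then inside the shell loop from above, each file could be checked and
recorded, e.g.:

    # Skip names already recorded, remember the new ones afterwards.
    seen=$(psql -d logs -tAc \
        "SELECT 1 FROM imported_files WHERE filename = '$f'")
    [ "$seen" = "1" ] && continue
    psql -d logs -c "\\copy import_tmp FROM '$f' WITH CSV"
    psql -d logs -c "INSERT INTO imported_files (filename) VALUES ('$f')"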
So how would I get the filenames into the SQL script?