On 1/26/07, Tomi N/A <hefest@xxxxxxxxx> wrote:
> 2007/1/23, Paul Lambert <paul.lambert@xxxxxxxxxxxxxxxxxx>:
> > Is there an equivalent in Postgres to the DTS Packages available in
> > M$ SQL server.
>
> what you're looking for exists in a number of variations. You can use a
> good text editor with the postgres COPY command for simple bulk .csv
> loading, but in the long run, you might want to consider a
In my opinion, if your input data is well-formed csv, you don't really
need much of anything. Make one or more import tables with all-text
columns that will accept the csv data from the COPY statement. After
that, write insert...select queries that move the data from your import
tables into the actual tables, doing all the appropriate casting
in-query (a sketch of both steps is at the end of this message).

Besides being easy to schedule and very flexible, manipulating data
with queries is extremely powerful and fairly easy to maintain,
assuming you know a little SQL, thanks to PostgreSQL's huge array of
built-in string manipulation functions. The skills you learn here will
pay off elsewhere in the database as well. Not only that, this approach
will be fast, since it is declarative and handles entire tables at
once, as opposed to DTS-ish solutions, which tend to process record by
record. Not to mention they are overcomplicated and tend to suck. (DTS
does have the ability to read from any ODBC source, which is nice, but
that doesn't apply here.) In fact, my favorite use for DTS is to
convert databases out of Microsoft SQL Server and (ugh!) Access, a task
at which it excels, but the real magic there is in the ODBC driver, not
DTS.

Worst case, you have to do some preprocessing in C or perl on the csv
document if it is not completely well formed and blows up PostgreSQL's
COPY statement. In other words, you don't need a data processor;
PostgreSQL *is* a data processor.
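For example, here is a minimal sketch of the load step (the table,
column, and file names are all made up for illustration, and I'm
assuming a server-side file at /tmp/orders.csv with a header row):

    -- staging table: all text columns, so COPY accepts whatever is in the file
    CREATE TABLE orders_import (
        order_id    text,
        ordered_on  text,
        amount      text
    );

    -- bulk-load the csv; this form reads a server-side file
    -- (from psql you can use \copy instead for a client-side file)
    COPY orders_import FROM '/tmp/orders.csv' WITH CSV HEADER;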
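The insert...select step then does the casting and cleanup in one
declarative pass over the whole table, again with hypothetical names
and columns:

    -- the real table, with proper types
    CREATE TABLE orders (
        order_id    integer PRIMARY KEY,
        ordered_on  date,
        amount      numeric
    );

    -- move the data over, casting and scrubbing in-query
    INSERT INTO orders (order_id, ordered_on, amount)
    SELECT order_id::integer,
           ordered_on::date,
           replace(amount, '$', '')::numeric  -- string functions clean as you go
      FROM orders_import
     WHERE order_id ~ '^[0-9]+$';             -- skip junk rows instead of aborting

Put the two steps in a script and cron it, and you have your scheduled
load.

merlin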