Besides being easy to schedule and very flexible, manipulating data with queries is extremely powerful and fairly easy to maintain -- assuming you know a little SQL -- thanks to PostgreSQL's huge array of built-in string manipulation functions. The skills you learn here will pay off elsewhere in the database, too. Not only that, this approach is fast: it is declarative and handles entire tables at once, whereas DTS-ish solutions tend to do their processing record by record. Not to mention they are overcomplicated and tend to suck. (DTS does have the ability to read from any ODBC source, which is nice... but that does not apply here.)
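To make the set-based point concrete, here's a minimal sketch of the "load, then transform declaratively" pattern. It uses Python's sqlite3 only so the example is self-contained; in PostgreSQL you'd COPY into a staging table and reach for richer built-ins like split_part() and to_date(). The table and column names are made up for illustration.

```python
import sqlite3

# Stand-in for COPY: raw lines land in a staging table first.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging(raw_name TEXT);
INSERT INTO staging VALUES ('Ada Lovelace'), ('Alan Turing');
CREATE TABLE people(first_name TEXT, last_name TEXT);
""")

# One declarative statement transforms every row at once -- no
# record-by-record loop, just string functions over the whole table.
conn.execute("""
INSERT INTO people(first_name, last_name)
SELECT substr(raw_name, 1, instr(raw_name, ' ') - 1),
       substr(raw_name, instr(raw_name, ' ') + 1)
FROM staging
""")

rows = conn.execute("SELECT * FROM people ORDER BY last_name").fetchall()
print(rows)  # [('Ada', 'Lovelace'), ('Alan', 'Turing')]
```

The planner gets to optimize the whole transformation as one statement, which is where the speed advantage over row-at-a-time tools comes from.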
Different strokes for different folks, it seems. I'd argue that COPY followed by a barrage of plpgsql statements can't be used for anything but the most trivial data migration cases (where it's invaluable), where you have line-organized input for a handful of tables at most. In my experience (which is probably very different from anyone else's), most real-world situations involve data from a number of very different sources, ranging from the simplest (.csv and, arguably, .xml) to the relatively complex (a couple of proprietary databases, lots of tables, on-the-fly row merging, splitting or generating primary keys, date format problems, and general pseudo-structured, messed-up information). Once you've got your data into your target database (say, pgsql), using SQL to manipulate it makes sense, but it is only the _final_ step of an average, real-world data transformation.

Cheers,
t.n.a.
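For what it's worth, that multi-source scenario can still end in SQL once everything has been staged. A hedged sketch (again using sqlite3 just to keep it runnable; source names, columns, and the de-duplication rule are invented for illustration -- in PostgreSQL a SERIAL column or sequence would hand out the surrogate keys):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Each messy source gets its own staging table, loaded however it arrives.
conn.executescript("""
CREATE TABLE src_csv(email TEXT);   -- e.g. a .csv load
CREATE TABLE src_db(email TEXT);    -- e.g. a dump from a legacy database
INSERT INTO src_csv VALUES ('a@x.com');
INSERT INTO src_db  VALUES ('a@x.com'), ('b@x.com');
CREATE TABLE customers(id INTEGER PRIMARY KEY, email TEXT UNIQUE);
""")

# The final step is SQL: merge the sources, de-duplicate on email,
# and let the PRIMARY KEY column generate surrogate keys.
conn.execute("""
INSERT INTO customers(email)
SELECT email FROM src_csv
UNION
SELECT email FROM src_db
""")

print(conn.execute("SELECT count(*) FROM customers").fetchone()[0])  # 2
```

Everything upstream of the staging tables (parsing proprietary formats, fixing date formats) still needs whatever tooling fits the source, which is exactly the point above.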