On 3/29/06, Ted Byers <r.ted.byers@xxxxxxxxxx> wrote:
> May I ask a question about this?
>
> I will be working on an older database in which the original developer
> stored XML documents as a single variable-length text field. To process
> it, the whole document has to be retrieved and parsed.

heheh :)

> But the structure is simple in that it has an element for each field in
> the DB that replaced the earlier version. But people are still using the
> earlier one because they still need access to the old data, and no tool
> has yet been written by my predecessors to move the old data over to the
> new DB. Does the XML support you're all talking about make it less
> tedious to develop tools to parse these old XML files and put their data
> into the right field of the right table? I can develop the tool our
> users need using the resources I have at hand, but the process of
> parsing these XML files is certain to be very tedious; something I am
> not looking forward to. There is a reason I don't do much with XML even
> though I know how.

Most high-level languages these days have decent XML parsing support. My
suggestion would be to parse the documents into INSERT statements the
easiest way possible, then just load those into the new database.

PostgreSQL's string processing is powerful enough that you can create a
view which presents the old structure if you want your legacy app to keep
accessing the database without substantial modification. Check out
array_cat, array_to_string, etc., and you could explore custom aggregates
if necessary.
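For example, here is a minimal sketch of the parse-into-INSERTs step in
Python. The <record> document shape, the table name, and the column names
are invented for illustration; substitute whatever the old schema actually
uses, and prefer parameterized queries over string quoting in real code.

import xml.etree.ElementTree as ET

# Hypothetical legacy document: one element per column of the new table.
sample_doc = """
<record>
  <customer_name>Acme Ltd</customer_name>
  <order_date>2006-03-29</order_date>
  <total>199.95</total>
</record>
"""

def doc_to_insert(xml_text, table="new_table"):
    """Turn one legacy XML document into an INSERT statement."""
    root = ET.fromstring(xml_text)
    cols, vals = [], []
    for child in root:
        cols.append(child.tag)
        # Naive quoting, for illustration only.
        vals.append("'" + (child.text or "").strip().replace("'", "''") + "'")
    return "INSERT INTO %s (%s) VALUES (%s);" % (
        table, ", ".join(cols), ", ".join(vals))

print(doc_to_insert(sample_doc))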
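And a sketch of the compatibility-view idea, driven from Python with
psycopg2 just to keep one language here: the view glues the new columns
back into the old single-text-field layout using array_to_string. Every
table, column, and connection detail below is made up, so adjust to fit.

import psycopg2

# Rebuild the old "one XML document per row" layout on top of the new,
# normalized table so the legacy app can keep selecting from it.
CREATE_LEGACY_VIEW = """
CREATE OR REPLACE VIEW legacy_records AS
SELECT array_to_string(ARRAY[
         '<record>',
         '<customer_name>', customer_name,    '</customer_name>',
         '<order_date>',    order_date::text, '</order_date>',
         '<total>',         total::text,      '</total>',
         '</record>'
       ], '') AS xml_doc
FROM new_table;
"""

conn = psycopg2.connect("dbname=mydb user=me")  # hypothetical DSN
try:
    cur = conn.cursor()
    cur.execute(CREATE_LEGACY_VIEW)
    conn.commit()
finally:
    conn.close()

Merlin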