Casey Allen Shobe wrote:
I have a large amount of data in DB2 and Microsoft SQL databases that I would
like to copy to new tables in Postgres, primarily for testing and
proof-of-concept purposes.
I have a Linux/PHP installation supporting all three, and was wondering if PHP
would be a good medium for working this task.
I think that depends on how much data is "a large amount" and how much
patience you have ;-)
Seriously, I think you would be better off using MSSQL's BCP (bulk copy)
program to create a tab-delimited file and then using PostgreSQL's COPY to
import the data. The process would be something like this (a rough sketch
of the actual commands follows the list):
- Use bcp to export data file
- Possibly use sed/awk/your-favorite-text-processing-program to adjust
the output (only *if* necessary) -- things to watch out for here are
delimiter characters (e.g. tabs) embedded in your data, which need
escaping of some sort, and NULL values. I can't remember how much
control BCP gives you in representing NULLs, but pgsql COPY lets you
specify the NULL string explicitly.
- Create the tables in pgsql; if possible, do not create indexes or
foreign keys at this point.
- Import data using COPY
- Create indexes and referential integrity constraints
- Vacuum analyze
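Just to make that concrete, the whole thing might look roughly like the
sketch below. The table name, columns, and connection details are all
made up, and the bcp flags and COPY options should be double-checked
against the versions you are actually running:

  # 1. Export from MS SQL Server as tab-delimited text
  #    (-c = character mode; default terminators are tab and newline)
  bcp mydb.dbo.mytable out mytable.tsv -c -S myserver -U myuser -P mypass

  # 2. Only *if* necessary: strip DOS line endings and double any
  #    backslashes so COPY doesn't misread them as escape sequences
  sed -e 's/\\/\\\\/g' -e 's/\r$//' mytable.tsv > mytable.clean.tsv

  # 3. Bare table first -- no indexes or foreign keys yet
  psql mydb -c "CREATE TABLE mytable (id integer, name text, amount numeric)"

  # 4. Load the data; NULL AS '' makes the empty fields bcp writes for
  #    NULLs come in as NULLs (genuinely empty strings will too)
  psql mydb -c "\copy mytable from 'mytable.clean.tsv' with null as ''"

  # 5. Now the indexes and referential integrity constraints
  psql mydb -c "CREATE INDEX mytable_name_idx ON mytable (name)"

  # 6. Finally, update the planner statistics
  psql mydb -c "VACUUM ANALYZE mytable"

Using psql's \copy rather than a straight COPY has the nice property that
the data file only needs to be readable by the client, not by the backend.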
This process should be orders of magnitude faster than querying the data
out, looping through it row by row, building and executing insert
statements, etc. I suppose you could build the import file using PHP and
then use Postgres COPY, but it would still be more work and a lot slower
than using the vendor-provided export utilities.
I'm not familiar with DB2, but I'd guess it comes with some sort of bulk
export utility also.
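If it does, the equivalent from the DB2 command line is presumably
something along these lines -- untested guesswork on my part, so check
the EXPORT options against the DB2 documentation:

  # DB2 command-line processor; OF DEL writes a delimited text file
  db2 connect to mydb
  db2 "EXPORT TO mytable.del OF DEL SELECT * FROM mytable"

Note that OF DEL defaults to comma delimiters with quoted character
fields, so the cleanup and COPY options above would need adjusting to
match.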
HTH,
Joe