Sydney Puente wrote:
Hello,
It seems that very shortly I will have to extract some data from
MS SQL Server and load it into Postgres.
The MS SQL database is getting bigger, with increasingly many users,
and performance is getting worse and worse.
A local copy of the data is required for an application that needs to
pull lots of small bits of data back quickly via complex queries
(currently embedded in views). Data that is no more than 24 hours old
is acceptable.
Although I am pretty handy with SQL on Oracle and MySQL, this is a bit
outside my comfort zone.
Could someone provide a few pointers on how to go about this and what
the issues might be?
MS SQL Server has a pretty useful tool, Data Transformation Services
(DTS), which can be used to export data on a scheduled, scripted batch
basis to external databases... You'd need to install an OLEDB/ADODB or
ODBC driver for Postgres on the SQL Server machine to do it this way.
Alternatively, you could run a script on the SQL Server (or on any
client that has access to it) to export the data you want to CSV
file(s), then import those into a Postgres database.
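A minimal sketch of such a script in Python, using pyodbc on the SQL
Server side and psycopg2 on the Postgres side. The connection strings,
the "orders" table, and its columns are all placeholders, not anything
from your actual schema:

import csv
import pyodbc      # SQL Server access via ODBC
import psycopg2    # PostgreSQL access

# Connection strings, the "orders" table, and its columns are placeholders.
mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mssql-host;DATABASE=appdb;UID=reader;PWD=secret"
)
pg = psycopg2.connect("host=pg-host dbname=appdb user=loader password=secret")

# 1. Export from SQL Server to a CSV file. pyodbc hands back Python
#    strings already decoded, so the file can simply be written as UTF-8.
with open("orders.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    src = mssql.cursor()
    src.execute("SELECT order_id, customer, amount FROM dbo.orders")
    for row in src:
        writer.writerow(row)

# 2. Bulk-load the CSV into Postgres with COPY (full refresh each run).
with open("orders.csv", encoding="utf-8") as f, pg, pg.cursor() as cur:
    cur.execute("TRUNCATE orders")
    cur.copy_expert(
        "COPY orders (order_id, customer, amount) FROM STDIN WITH CSV", f)

COPY is much faster than row-by-row INSERTs, which matters once the
tables get large.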
Where things might get messy: if this data is large and growing as
you say, doing a full copy like this could be very time-consuming on a
daily basis. Doing incremental replication between dissimilar
databases is not at all easy.
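If your tables happen to carry a last-modified timestamp, one rough
incremental approach looks like the sketch below. Everything here is
an assumption: the "orders" table, its last_modified column, the
primary key, and the connection strings are hypothetical, and the
ON CONFLICT upsert needs PostgreSQL 9.5 or later:

import pyodbc
import psycopg2

# Hypothetical schema: orders(order_id primary key, customer, amount,
# last_modified timestamp maintained by the source application).
mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mssql-host;DATABASE=appdb;UID=reader;PWD=secret"
)
pg = psycopg2.connect("host=pg-host dbname=appdb user=loader password=secret")

with pg, pg.cursor() as cur:
    # High-water mark: the newest change already copied to Postgres.
    cur.execute(
        "SELECT coalesce(max(last_modified), '1970-01-01') FROM orders")
    watermark = cur.fetchone()[0]

    # Pull only the rows changed since the last run.
    src = mssql.cursor()
    src.execute(
        "SELECT order_id, customer, amount, last_modified "
        "FROM dbo.orders WHERE last_modified > ?",
        watermark,
    )
    for order_id, customer, amount, last_modified in src:
        # Upsert each changed row into the local copy.
        cur.execute(
            "INSERT INTO orders (order_id, customer, amount, last_modified) "
            "VALUES (%s, %s, %s, %s) "
            "ON CONFLICT (order_id) DO UPDATE SET "
            "customer = EXCLUDED.customer, amount = EXCLUDED.amount, "
            "last_modified = EXCLUDED.last_modified",
            (order_id, customer, amount, last_modified),
        )

Note that this only picks up inserts and updates; rows deleted on the
SQL Server side would need separate handling, which is part of why
incremental replication between dissimilar databases is hard.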
The first issue that occurs to me is that CP1252 is used throughout
the data, and there are a lot of European special characters (e-acute,
for example). But the column names etc. are regular characters [a-zA-Z].
CP1252, aka Windows-1252, is actually pretty close to ISO-8859-1, aka
LATIN1. The main difference is that CP1252 uses the 0x80-0x9F range for
additional printable characters, a range that is unused (reserved for
control codes) in LATIN1.
Personally, I'd probably make the Postgres database UTF-8, then use
WIN1252 (Postgres's name for Windows-1252) as the client_encoding
during the import process.
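For example, a minimal sketch of that import with psycopg2, assuming
the CSV was exported in raw CP1252 from the source system and the
target database was created as UTF-8 (e.g. createdb -E UTF8 appdb);
the file and table names are placeholders:

import psycopg2

pg = psycopg2.connect("host=pg-host dbname=appdb user=loader password=secret")
pg.set_client_encoding("WIN1252")

# Open the file in binary mode: the CP1252 bytes go over the wire as-is,
# and the server converts them to UTF-8 because client_encoding is WIN1252.
with open("orders.csv", "rb") as f, pg, pg.cursor() as cur:
    cur.copy_expert("COPY orders FROM STDIN WITH CSV", f)

Letting the server do the conversion this way means any byte sequence
that isn't valid CP1252 fails loudly at load time instead of silently
corrupting the stored text.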