Re: Importing *huge* mysql database into pgsql

In article <1173191066.416664.320470@xxxxxxxxxxxxxxxxxxxxxxxxxxxx>,
".ep" <erick.papa@xxxxxxxxx> writes:

> Hello,
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.

> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.

> Is there anything else?

If you really want to convert a *huge* MySQL database (and not your
tiny 5M record thingie), I'd suggest "mysqldump -T". This creates, for
each table, an .sql file containing just the DDL and a .txt file
containing the data (tab-separated).
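
Something along these lines (directory and database name are just
placeholders; note that the .txt files are written by the mysqld server
itself, so mysqldump has to run on the server host, the dump directory
must be writable by the server, and your MySQL account needs the FILE
privilege):

    # Dump each table of "mydb" into <table>.sql (DDL) and <table>.txt (data).
    mkdir /tmp/mysqldump-out
    chmod 777 /tmp/mysqldump-out    # or chown it to the mysql user
    mysqldump -T /tmp/mysqldump-out mydb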

Then edit all .sql files:
* Fix type and index definitions etc.
* Append a "COPY thistbl FROM 'thispath/thistbl.txt';"
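
If you'd rather not paste the COPY lines in by hand, a small shell loop
can append them (paths are placeholders; keep in mind that COPY ... FROM
reads the file on the PostgreSQL server side, so it needs an absolute
path readable by the server process -- psql's \copy is the client-side
alternative):

    # Append a COPY command to every per-table .sql file.
    for f in /tmp/mysqldump-out/*.sql; do
        tbl=$(basename "$f" .sql)
        echo "COPY $tbl FROM '/tmp/mysqldump-out/$tbl.txt';" >> "$f"
    done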

Then run all .sql files with psql, in an order dictated by foreign keys.
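
For example (table names made up, just to show the ordering -- load the
referenced tables before the tables whose foreign keys point at them):

    createdb mydb
    psql -d mydb -f /tmp/mysqldump-out/customers.sql
    psql -d mydb -f /tmp/mysqldump-out/orders.sql    # orders references customers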


