Thanks for the ideas, Sir.

I haven't worked with a DB this deeply before. Basically, I need to migrate the data from an existing DB into a completely new system built in Django. The schema created in Django differs significantly from the source DB in terms of tables and columns. The source DB has no timestamp data, but the corresponding column in the destination DB is NOT NULL and has no default value.
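One way around that NOT NULL timestamp problem - just a sketch, assuming the lib_author table and its created/modified columns from the code quoted below - is to give the destination columns a temporary default for the duration of the migration, or simply to supply current_timestamp explicitly in every INSERT:

    -- assumption: lib_author.created / lib_author.modified are the NOT NULL
    -- timestamp columns; drop the defaults again once the migration is done
    ALTER TABLE lib_author ALTER COLUMN created SET DEFAULT current_timestamp;
    ALTER TABLE lib_author ALTER COLUMN modified SET DEFAULT current_timestamp;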
On Wed, May 9, 2018 at 10:54 AM, David G. Johnston <david.g.johnston@xxxxxxxxx> wrote:

> I am trying to concatenate the values of the firstname and lastname columns from the source DB into the name column of the destination DB.
>
> # cur_t is the source-DB cursor, cur_p the destination-DB cursor
> for row in cur_t:
>     cur_p.execute("""
>         INSERT INTO lib_author (
>             created, modified, last_name,
>             first_name, country,
>             school_id, name)
>         VALUES (current_timestamp, current_timestamp, %s, %s, %s,
>             (SELECT id FROM ed_school WHERE name='My Test School'), %s)
>         """, (row['lastname'], row['firstname'], '',
>               row['firstname'] + ' ' + row['lastname']))

Actually, what I would do looks nothing like that...

I'd use psql to \copy the relevant information out of the source DB into a CSV file.
I'd use psql to \copy the just-exported data into a staging (temp/unlogged) table in the target DB.
I'd then write, still in the psql script connected to the target machine:

INSERT INTO lib_author
SELECT ...
FROM temp_table;

DROP TABLE temp_table; (if unlogged; if it is truly a temp table it will drop itself when the session ends)

A for-loop based migration should be a measure of last resort. SQL is a set-oriented language/system and you should design your processes to leverage that. Act on whole tables (or subsets of them via WHERE clauses) at a time, not on individual records.

You can access the same API via Python, so you wouldn't have to use psql - but moving CSV data in bulk between the servers and performing the calculations in bulk is the way to go if this is going to be anything more than a one-time toy project where you'll never touch a DB again.

My $0.02

David J.