Hi, this sounds huge, and it cries out for an SQL version of the import. Are both databases the same? MySQL?

I'll give you a draft for MySQL. You export the data you have; then you've got a text file with 100000+ SQL statements. In the PHP script you open the file and iterate over it line by line (careful: the delimiter could also be ";" in case it's a Unix-created file on a Windows platform). One line == one SQL INSERT into table bla bla... In the loop you then just call mysql_query() with this line. If someid has a unique index, the insert will fail for duplicates, so only those records are inserted that are not already in the database. A rough sketch of that loop follows at the end of this message.

But given the amount of records, this doesn't sound like an every-10-minutes job. If it is a rare job, just do it with phpMyAdmin.

Sorry for not pulling out the code, but there's a long day behind the keyboard and I need some sleep.

ralph_deffke@xxxxxxxx

"Devendra Jadhav" <devendra.in@xxxxxxxxx> wrote in message
news:be4b00cf0908151815r1c7430d2j8a6cb0da1f10ac00@xxxxxxxxxxxxxxxxx
> Hi,
>
> I have to import data from one database to another, around 100000
> (1 lakh) records.
> First I need to check whether a record is already imported, and import
> only those records which are not imported.
>
> Here is my logic:
>
> $already_imported = get_already_imported_records();
> The format of $already_imported is $already_imported[someid] = 'imported';
>
> Now I take all records from the other db and iterate through them:
>
> if (!key_exists($already_imported[$new_id])) {
>     import_function($new_id);
> } else {
>     echo 'already imported' . $already_imported[$new_id];
> }
>
> Now my script is importing the same records more than once, and I am not
> able to get through this issue.
>
> Is it because of the size of the records or something else...?
>
> Please suggest me a solution which is faster, safe, and easy to code :D
>
> Thanks in advance
>
> --
> Devendra Jadhav
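For what it's worth, here is a minimal sketch of that loop, assuming the old mysql_* extension and MySQL's duplicate-key error code 1062; the connection details, the file name export.sql, and the database name are all placeholders:

<?php
// Iterate over a dump file of INSERT statements and let the unique
// index on someid reject the rows that are already in the database.
$link = mysql_connect('localhost', 'user', 'password')
    or die('connect failed: ' . mysql_error());
mysql_select_db('target_db', $link)
    or die('select db failed: ' . mysql_error());

$fh = fopen('export.sql', 'r') or die('cannot open dump file');

$imported = 0;
$skipped  = 0;
while (($line = fgets($fh)) !== false) {
    // trim() removes the trailing "\n" and also the stray "\r" you
    // get when Unix and Windows line endings are mixed
    $sql = trim($line);
    if ($sql === '') {
        continue; // skip blank lines
    }
    if (mysql_query($sql, $link)) {
        $imported++;
    } elseif (mysql_errno($link) == 1062) {
        $skipped++; // 1062 = duplicate entry on a unique index
    } else {
        echo 'failed: ' . mysql_error($link) . "\n";
    }
}
fclose($fh);

echo "$imported imported, $skipped skipped as already present\n";
?>

Rewriting the dump statements as INSERT IGNORE would suppress the duplicate-key errors altogether; either way, the unique index on someid is what does the actual de-duplication.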
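As an aside on the quoted logic: key_exists() is an alias of array_key_exists() and takes two arguments, the key first and the array second, so the check as written never receives the array at all, which may well be why every record looks new. A corrected version of that condition might look like this (names taken from the quoted code):

// the key to look up comes first, then the array itself
if (!array_key_exists($new_id, $already_imported)) {
    import_function($new_id);
} else {
    echo 'already imported ' . $new_id;
}

// or, since every entry holds the non-null string 'imported',
// a plain isset() check is enough and a little faster
if (!isset($already_imported[$new_id])) {
    import_function($new_id);
}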