My approach in building the archive function is:
1) a SELECT query on the data to be archived,
2) mysql_fetch_array() to pull the result set into a PHP array,
3) INSERT queries to copy the data into the archive tables.
There's a rough sketch of this below.
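Roughly what I have in mind (this is only a sketch - the table names, column names, and cutoff date are made-up placeholders, and $link is assumed to be an already-open MySQL connection):

<?php
// Step 1: select the rows to archive (placeholder table/columns/date).
$result = mysql_query("SELECT id, customer, total, created
                       FROM orders
                       WHERE created < '2003-01-01'", $link);

// Step 2: pull everything into a PHP array.
$rows = array();
while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
    $rows[] = $row;
}

// Step 3: one INSERT per row into the archive table.
foreach ($rows as $row) {
    $sql = sprintf("INSERT INTO orders_archive (id, customer, total, created)
                    VALUES (%d, '%s', %s, '%s')",
                   $row['id'],
                   mysql_real_escape_string($row['customer'], $link),
                   $row['total'],
                   $row['created']);
    mysql_query($sql, $link);
}
?>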
My concern is that in some cases hundreds of rows would need to be moved, which could mean holding a very large array in memory. On the other hand, the archiving function is likely to be used infrequently, probably not more than once or twice a week.
This leads to two questions:
1) Could such a big array cause performance problems, or worse?
2) Is there a better way?
Many thanks,
Jeff