A project I'm working on for someone; his host died or something.
But phpMyAdmin (on that host) can NOT back up a 200MB table without running out of memory or something weird.
So what is the best way to dump the table to a text file?
I thought I would (pseudo-code):

    open DB
    do while not EOF
        grab 1000 rows from DB
        write to a file on host www path
        filename = filename000 + 1
    until EOF
The user would then FTP the files down to his machine.
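The loop above could be sketched in PHP roughly like this (the table name, connection details, output path, and 1000-row chunk size are all assumptions, and the old mysql_* API is used since that's what this list targets):

```php
<?php
// Sketch of the chunked-dump idea: pull 1000 rows at a time
// and write each chunk to a numbered file under the web root.
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('thedb', $link);

$chunk  = 1000;   // rows per file
$offset = 0;
$n      = 0;      // file counter

while (true) {
    $res = mysql_query("SELECT * FROM bigtable LIMIT $offset, $chunk", $link);
    if (mysql_num_rows($res) == 0) {
        break;    // EOF: no more rows
    }
    $fp = fopen(sprintf('/path/to/www/dump%03d.txt', $n), 'w');
    while ($row = mysql_fetch_row($res)) {
        fputs($fp, implode("\t", $row) . "\n");   // tab-separated text dump
    }
    fclose($fp);
    $offset += $chunk;
    $n++;
}
mysql_close($link);
?>
```

Only ever 1000 rows are in memory at once, which is the point: it sidesteps the out-of-memory problem phpMyAdmin hits when it tries to buffer the whole table.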
But then I figured I could use header() to send the file straight to the user.
I grabbed this from the header() section of the manual:
<?php
$output_file = 'something.txt';
$content_len = 666; // placeholder; set this to the real file size

@ob_end_clean();
@ini_set('zlib.output_compression', 'Off');
header('Pragma: public');
header('Content-Transfer-Encoding: none');
header('Content-Type: application/octet-stream; name="' . $output_file . '"');
header('Content-Disposition: attachment; filename="' . $output_file . '"');
header("Content-Length: $content_len");
?>
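One thing worth noting: the headers alone transmit nothing; after sending them you still have to emit the file's bytes, e.g. with readfile(). A minimal sketch (the path is an assumption for illustration):

```php
<?php
// Assumed location of the dump file produced earlier; adjust to your setup.
$output_file = 'something.txt';
$path = '/path/to/' . $output_file;

header('Content-Type: application/octet-stream; name="' . $output_file . '"');
header('Content-Disposition: attachment; filename="' . $output_file . '"');
header('Content-Length: ' . filesize($path)); // real size, not a hard-coded guess

readfile($path); // stream the file to the client without loading it all into memory
exit;
?>
```

readfile() reads and outputs the file in internal chunks, so this works for large files where file_get_contents() would blow the memory limit.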
Is that the best way to send the text file to the user?
thanks...
Try using mysqldump from the command line. If you need a PHP script to do this, use system(), the backtick operator, or one of the other program-execution functions.
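For reference, a typical invocation might look like this (host, user, database, and table names are placeholders):

```shell
# Dump one table to a file of SQL statements.
# --quick streams rows one at a time instead of buffering the whole
# result set in memory, which matters for a 200MB table.
mysqldump --quick -h localhost -u user -p thedb bigtable > bigtable.sql

# Or compress on the fly to keep the file small for FTP:
mysqldump --quick -h localhost -u user -p thedb bigtable | gzip > bigtable.sql.gz
```

The resulting .sql file can be reloaded later with the mysql client, which also solves the restore side of the problem, not just the backup.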
-- paperCrane <Justin Patrin>
-- PHP Database Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php