I am building a report application that generates text files for download, and the downloads stop prematurely. The files are currently on the order of megabytes. When I run the script that generates and sends the file on a test server, the process completes no matter the size of the file, but as soon as I move the script to the production server, downloads cut off at approximately 300 KB.

My current workaround is to gzip the files, and that is buying me some time, but the files are growing and sooner or later the workaround will become useless.

I suspect the download is being stopped by some timeout rather than by the number of KB transferred, because the cut-off size varies slightly. If that timeout exists, it should be approximately 5-10 seconds.

I use this function to send the file. $contenido is the content of the file (I assign it the full output of the report), and $nombre_archivo is the optional name for the file. I can paste more code, but I think the problem is here:

<?php
function enviarArchivo($contenido, $nombre_archivo = "")
{
    // Default to a timestamped CSV name if none was given
    if ($nombre_archivo == "") {
        $nombre_archivo = date("dmyHi") . ".csv";
    }
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"$nombre_archivo\"");
    header("Content-Length: " . strlen($contenido));
    echo $contenido;
}
?>

Thanks in advance,
Manuel
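In case it is relevant, this is the variant I am considering trying next. It is an untested sketch, written on the assumption that the cut-off comes from max_execution_time or output buffering on the production server (enviarArchivoPorPartes and the 8 KB chunk size are just my tentative choices):

<?php
// Untested sketch: stream the content in chunks instead of one big echo,
// assuming the production server kills long-running or buffered responses.
function enviarArchivoPorPartes($contenido, $nombre_archivo = "")
{
    if ($nombre_archivo == "") {
        $nombre_archivo = date("dmyHi") . ".csv";
    }

    set_time_limit(0); // lift PHP's script time limit (ignored in safe mode)

    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"$nombre_archivo\"");
    header("Content-Length: " . strlen($contenido));

    // Send 8 KB at a time and push each chunk to the client immediately
    $total = strlen($contenido);
    for ($offset = 0; $offset < $total; $offset += 8192) {
        echo substr($contenido, $offset, 8192);
        flush();
    }
}
?>

I also plan to compare max_execution_time in php.ini between the test and production servers, since that would explain why only production is affected.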