On Wed, 2007-10-24 at 21:07 +0000, Werner Schneider wrote:
> Hi, I've got a strange problem: using PHP 4.4.x, I capture the whole
> output for a web page into the output buffer with ob_start() and
> ob_get_clean(), because I have to make some replacements in the HTML
> code before sending the page to the browser.
>
> This worked fine with a small page, but now I have a page whose HTML
> code is about 280 KB (not too big, I think), and I get a server
> error 500 for this script on my Linux-based web host.
>
> I tried to run it on a local WAMP installation - it worked without error.
> I temporarily deleted some of the output - it worked without error.
> I turned off the output buffering - it worked without error.
>
> I printed memory_get_usage(), but it never exceeded 2 MB (I load about
> 200 records from a database, create objects from them, and then print
> the data of those 200 records).
> According to phpinfo(), my web host's memory_limit is 40M - so 2 MB
> SHOULD be no problem.
> I increased the memory limit to 64M - it didn't help.
>
> Any idea what I could try next? I have no access to the Apache error
> log, but the error is definitely connected to the output buffer and
> how much I try to store in it.
>
> Any help is welcome.

Use some random URL parameter and detect it in your script. When it is
detected, enable display_errors:

<?php
if ( isset( $_GET['knjdcrksjhfcsjkhfndkf'] ) ) {
    ini_set( 'display_errors', 1 );
}
?>

Then see if you get any errors. Although, if you're seg faulting, you
still won't see an error, since the program just dies.

Cheers,
Rob.
--
...........................................................
SwarmBuy.com - http://www.swarmbuy.com

    Leveraging the buying power of the masses!
...........................................................

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
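As a footnote to the thread: the capture-and-replace pattern the original
poster describes can also be done with an output callback passed to
ob_start(), so PHP rewrites the buffer as it is flushed instead of your
script holding the whole page in a string via ob_get_clean(). A minimal
sketch (the placeholder token and replacement value here are made up for
illustration; substitute your own rewriting logic):

```php
<?php
// Hypothetical rewriting step: replace a placeholder token in the
// finished HTML before it reaches the browser.
function rewrite_html( $html )
{
    return str_replace( '{SITE_NAME}', 'Example Site', $html );
}

// Register the callback; it receives the entire buffer contents
// when the buffer is flushed or ended.
ob_start( 'rewrite_html' );

echo '<title>{SITE_NAME}</title>';

// The callback runs here and the rewritten HTML is sent.
ob_end_flush();
?>
```

The callback style has been available since PHP 4, so it should also work
on the poster's 4.4.x host.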