Re: Browser displays blank page, while request still being handled

> I'm guessing it's just a timeout issue - "maximum execution time
> exceeded" type thing - check your php or apache error logs and see
> what that tells you.. if nothing shows up, turn log_errors on, restart
> apache and see what you get.
I don't think it's a timeout issue on the server side, because I've
already tried setting max_execution_time and max_input_time to extremely
high values, and got the same result (i.e. the blank page).
By setting them to extremely low values I was able to verify the
app/script/server's behaviour when it did time out, and that produced
the expected "maximum execution time exceeded" warning message.
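
For reference, this is roughly how the limits were raised from inside the
script (the directive names are PHP's real ones; the values here are just
illustrative, not the ones I actually used):

```php
<?php
// Raise the server-side limits for this request only.
// Values are illustrative; php.ini's own settings still apply elsewhere.
ini_set('max_execution_time', 3600); // seconds the script may run
ini_set('max_input_time', 3600);     // seconds allowed for parsing input
set_time_limit(3600);                // also resets the execution-time counter
```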

> A browser has a timeout too.. type about:config in firefox, look for
> network.http.connect.timeout - timeout is 30 seconds and
> network.http.request.timeout is 120 seconds... that's not a long time.
Browsers indeed have timeouts, but my Firefox (v1.5.0.3) didn't have
those settings defined, so I added them myself. It doesn't seem to matter
what values I use (2 sec or 2000 sec): I still get my blank page, even
while the request hasn't finished yet.

I couldn't find anything in the logs, although log_errors is turned on
and the logging level is set to log everything that's possible.
>
> Try adding some flush() calls to the script. That might get the
> browser to display some content.. Don't know whether it will work though.
That's something I haven't tried yet, so thanks for the tip.
>
> Copy has to wait until it's finished before telling you the results.
> If it's copying a 20M file, that takes a while.. If you're copying
> multiple 20M files, then hey.. there's your problem. How many and
> how big are the files in this case?
We're talking about multiple copy operations.  The first one copies 230
files (approx. 40 MB) and the second one copies about 2200 files
(195 MB).  Most files are under 400 KB apiece.

> Time how long it takes you to manually copy that amount of data.
Tried the flush() calls in between the different parts of the request;
it doesn't help :-( Although I don't get why you'd want me to time the
copying, I did. In fact, I timed all operations: deleting the old files
takes about 3 seconds, copying the new data takes about 20 seconds, and
the database requests plus generating the XML take another 10 seconds.
All in all, the complete operation takes under 35 seconds, which should
be fast enough to keep the browser or server from timing out.
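
For completeness, this is roughly how I wired up the flush() calls (a
sketch only; $filesToCopy is a placeholder for our real copy loop, and
whether the bytes actually reach the browser also depends on
output_buffering, zlib.output_compression and any proxy in between):

```php
<?php
// Try to push partial output to the browser between copy steps.
ob_implicit_flush(true);        // flush PHP's buffer after every echo
while (ob_get_level() > 0) {
    ob_end_flush();             // drop any nested output buffers first
}

// $filesToCopy (src => dst) is a made-up name standing in for the real list.
foreach ($filesToCopy as $src => $dst) {
    copy($src, $dst);
    echo "Copied " . htmlspecialchars($src) . "<br>\n";
    flush();                    // hand the bytes to the web server
}
```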

BTW, would a timeout explain why the browser wants to repost the request
when I try to view the source code of the blank page?

> You might be better off running a script to run through cron every 5
> minutes or so and doing it all for you.. then getting it to email you
> the results.
>
Running the script through cron wouldn't be a good idea.  The app we're
talking about is a kind of Content Management System for photographers,
so the publish-request should only be called when the user has finished
his modifications to the site and is ready to publish them to his website.

> The request goes into a "process" queue with the relevant details..
> obviously this only gets logged right at the end. Cron job runs, finds
> the details, does its work..
Like I explained before, I don't think a cron job can be used, because
the website should only be published when the user wants it to be.

> If you're trying to do all of this in one step I think you're out of
> luck and you might need to break up the processes.
What do you mean by breaking up the processes? Put them in different requests, so that one request calls another one?
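
In case it moves the discussion along, here's a rough sketch of what I
imagine "different requests" could look like: a kick-off request that
returns immediately and a status request the browser polls. All file
names, paths and the /tmp status file are assumptions on my part, not
anything we have running:

```php
<?php
// publish_start.php -- kick-off request: queue the job, return right away.
$jobId = uniqid('publish_', true);
file_put_contents("/tmp/$jobId.status", 'queued');

// Hand the heavy copy work to a background process so this request
// finishes fast. (Script path is a placeholder.)
exec("php /var/www/app/do_publish.php $jobId > /dev/null 2>&1 &");
echo $jobId; // the browser then polls publish_status.php?job=<id>

// publish_status.php -- the browser polls this until it reads "done".
// $jobId = isset($_GET['job']) ? basename($_GET['job']) : '';
// echo file_get_contents("/tmp/$jobId.status");
```

The background script would update the status file as it deletes, copies
and generates the XML, so the user still triggers the publish explicitly.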

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

