Re: serve long duration pages

Jochem Maas wrote:
John Gunther wrote:
This may be more appropriate for an Apache or browser forum. Perhaps you
can suggest an appropriate one.

I have a PHP script that takes a long time to complete (due to very
heavy database activity), although it echoes unbuffered output every few
seconds. I've set max_execution_time to 0 with set_time_limit(0) so PHP
doesn't time out the page, but after about 3 minutes Firefox says, "The
connection to the server was reset while the page was loading." and IE7
says, "Internet Explorer cannot display the web page". I want to try
apache_reset_timeout() but it's not implemented in PHP 4.3.10.

Is there anything I can do from PHP or elsewhere to solve this?

yes.

let's start by saying that any webpage script that takes that long to run is
shit by design.

what I would suggest is that you run a script via cron that generates a static
page which you can then view whenever required; let the cron job run at whatever
interval is required/possible.
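
a rough sketch of what such a cron script might look like - the query, table,
and output path here are made up, the real ones depend on your app:

<?php
// generate_report.php - run from cron via the CLI, not through Apache.
// rough sketch only: the query, table, and output path are made up.
set_time_limit(0); // the CLI run can take as long as it needs

$db = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $db);

$result = mysql_query('SELECT name, total FROM heavy_report', $db);

$html = "<html><body><table>\n";
while ($row = mysql_fetch_assoc($result)) {
    $html .= '<tr><td>' . htmlspecialchars($row['name']) . '</td><td>'
           . htmlspecialchars($row['total']) . "</td></tr>\n";
}
$html .= "</table></body></html>\n";

// write to a temp file and rename, so nobody ever sees a half-built page
$tmp = '/var/www/html/report.html.tmp';
$fp = fopen($tmp, 'w');
fwrite($fp, $html);
fclose($fp);
rename($tmp, '/var/www/html/report.html');
?>

plus a crontab line to rebuild it at whatever interval suits, e.g. every
15 minutes:

*/15 * * * * php /path/to/generate_report.php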

your database may be in bad shape - check that you have indexes on all the fields
you're searching/joining on (as a first step)
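
for example (made-up table and column; the point is the ADD INDEX plus an
EXPLAIN to confirm MySQL actually uses it):

<?php
// hypothetical illustration: index a column used in WHERE/JOIN clauses,
// then EXPLAIN the slow query to confirm the index is actually used.
$db = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $db);

// suppose the slow query filters and joins on orders.customer_id
mysql_query('ALTER TABLE orders ADD INDEX idx_customer (customer_id)', $db);

// "type: ref" using key idx_customer is good; "type: ALL" means a full scan
$result = mysql_query('EXPLAIN SELECT * FROM orders WHERE customer_id = 42', $db);
while ($row = mysql_fetch_assoc($result)) {
    print_r($row);
}
?>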

another alternative would be to break down the output of the script into
several pages.
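
i.e. page through the result set with LIMIT instead of dumping it all at
once - a sketch with invented table/column names:

<?php
// sketch of paging the output; table and column names are invented.
$page     = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$per_page = 100;
$offset   = ($page - 1) * $per_page;

$db = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $db);

// LIMIT offset, count fetches only one page's worth of rows
$sql = sprintf('SELECT name, total FROM heavy_report LIMIT %d, %d',
               $offset, $per_page);
$result = mysql_query($sql, $db);
while ($row = mysql_fetch_assoc($result)) {
    echo htmlspecialchars($row['name']), ' - ',
         htmlspecialchars($row['total']), "<br>\n";
}

echo '<a href="?page=' . ($page + 1) . '">Next page</a>';
?>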

sorry I don't have a real solution with regard to forcing the browser to
keep sucking on the connection. maybe someone else with more fu does.

John Gunther
Bucks vs Bytes Inc


You can run DB queries asynchronously and send invisible HTML (e.g., "<font></font>") to the browser every few seconds. But Jochem is right: there are exactly zero cases in which a web application should have page times measured in minutes. At that point, you're doing something wrong. Either your database is poorly designed and needs optimization, or you need to rethink how you deliver data to the client. Even if a page is meant for internal use, if it takes longer than a second or two to load I start looking into optimization or another way of getting the data. I often spin big operations off into cron jobs so that the data or operation is already available by the time a user loads the page.
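A rough sketch of that keep-alive trick - the chunk function is a
hypothetical stand-in for the real database work, and it assumes nothing
between PHP and the browser buffers the output:

<?php
// sketch of the keep-alive idea: no PHP timeout, do the heavy work in
// chunks, and push invisible markup between chunks so the browser sees
// steady traffic. do_chunk_of_work() is a hypothetical stand-in.
set_time_limit(0);

function do_chunk_of_work($i)
{
    sleep(2); // placeholder for one slow batch of queries
}

// defeat PHP output buffering so each echo reaches the client at once
while (ob_get_level() > 0) {
    ob_end_flush();
}

$total_chunks = 50; // however many slices the job splits into
for ($i = 0; $i < $total_chunks; $i++) {
    do_chunk_of_work($i);

    echo '<font></font>'; // invisible padding keeps the connection warm
    flush();              // force it out to the browser now
}

echo 'Done.';
?>

Note that mod_gzip/mod_deflate or a buffering proxy can still hold the
padding back, in which case the browser may time out anyway.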

Looking at the database angle: even complex queries against databases with millions of rows should return almost instantaneously with proper indexes and structure. I used to run a site with about 15 thousand rows - a single table, very simple queries. And yet those queries started taking several SECONDS as the table grew. By the simple act of throwing in a few well-placed indexes, times dropped from seconds to milliseconds, an improvement of orders of magnitude. I sped things up further using the MySQL query cache, and by doing all my SELECT queries against a HEAP table (a MySQL table stored entirely in memory), while updates and inserts went to disk and were then mirrored back into the HEAP table. There are so many ways to optimize databases, it's insane.
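A rough sketch of that read-from-memory pattern, with invented table and
column names (old MySQL calls the in-memory engine HEAP; note it can't
hold TEXT/BLOB columns):

<?php
// sketch of the HEAP mirroring pattern: writes hit the on-disk table
// first, then are replayed into an in-memory copy that serves all reads.
$db = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $db);

// one-time setup: an in-memory copy of the on-disk table
mysql_query('CREATE TABLE scores_heap TYPE=HEAP SELECT * FROM scores', $db);

// writes go to disk first, then get mirrored into the HEAP copy
$name = mysql_real_escape_string('alice', $db);
mysql_query("INSERT INTO scores (name, score) VALUES ('$name', 97)", $db);
mysql_query("INSERT INTO scores_heap (name, score) VALUES ('$name', 97)", $db);

// all SELECT traffic comes straight from memory
$result = mysql_query('SELECT name, score FROM scores_heap ORDER BY score DESC', $db);
while ($row = mysql_fetch_assoc($result)) {
    echo htmlspecialchars($row['name']), ': ', (int)$row['score'], "<br>\n";
}
?>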

Regards, Adam.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

