First, thanks to all who have read ;-)
Your solution looks a lot like the way I actually did it.
I've spent the last few hours testing the solution with SQLite on the
application servers.
The solution:
3 MySQL servers (1 more to handle the big load): 1 master, 2 slaves
MySQL replication
10 application servers (got two more from my hoster today :-) )
each application server runs PHP and SQLite
On the first load, the big query from the MySQL server is synchronised
into the local SQLite database (rough sketch below).
Entries are valid for about 6 hours; after that the server fetches the
new list.
Performance looks nice:
total load time: between 0.03 and 0.09 seconds
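For anyone interested, the local caching roughly works like this (very
simplified sketch - the table, host, database and column names are just
examples, and I'm using PDO for the SQLite side):

// on each application server: keep a local SQLite copy of the big query
$ttl  = 21600; // entries are valid for about 6 hours
$lite = new PDO('sqlite:/var/cache/biglist.sqlite');
$lite->exec('CREATE TABLE IF NOT EXISTS big_list (
                id INTEGER PRIMARY KEY,
                data TEXT,
                cached_at INTEGER)');

// how old is the local copy?
$cachedAt = (int) $lite->query('SELECT MAX(cached_at) FROM big_list')->fetchColumn();

if (time() - $cachedAt > $ttl) {
    // stale (or first load): pull the big query from a MySQL slave
    // and rewrite the local SQLite copy
    mysql_connect('mysql-slave', 'user', 'pass');
    mysql_select_db('mydb');
    $rs = mysql_query('SELECT id, data FROM big_table');

    $lite->exec('DELETE FROM big_list');
    $ins = $lite->prepare('INSERT INTO big_list (id, data, cached_at) VALUES (?, ?, ?)');
    while ($row = mysql_fetch_assoc($rs)) {
        $ins->execute(array($row['id'], $row['data'], time()));
    }
}

// every request on this server now reads from the local SQLite file
$rows = $lite->query('SELECT id, data FROM big_list')->fetchAll(PDO::FETCH_ASSOC);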
But I found another problem while working on the servers:
the application servers create images and thumbnails of them in
various sizes (GD lib etc.),
then each server opens an FTP connection (ftp_connect()...) to a global
data & storage server.
The data server only has FTP running, so I have to create the thumbs on
the application server and move all the files over to the data server.
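Roughly, that part looks like this (simplified - the host name, paths,
login and sizes are just examples):

// build a thumbnail with GD ...
$src = imagecreatefromjpeg('/tmp/upload.jpg');
$w = imagesx($src);
$h = imagesy($src);
$thumbW = 150;
$thumbH = (int) ($h * ($thumbW / $w));

$thumb = imagecreatetruecolor($thumbW, $thumbH);
imagecopyresampled($thumb, $src, 0, 0, 0, 0, $thumbW, $thumbH, $w, $h);
imagejpeg($thumb, '/tmp/thumb_150.jpg', 85);
imagedestroy($thumb);
imagedestroy($src);

// ... then move the finished thumb to the global data & storage server
$conn = ftp_connect('data.example.com');
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true);
ftp_put($conn, 'thumbs/thumb_150.jpg', '/tmp/thumb_150.jpg', FTP_BINARY);
ftp_close($conn);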
Question:
*can PHP handle some compression with FTP*, so that I can move some
more data?
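(What I was thinking of as a workaround, if the ftp_* functions can't
compress on the wire, is gzipping the files before the upload and
decompressing them on the data server - not sure if that's the right
way, and for JPEGs it probably won't gain much since they are already
compressed, but for text/XML it should help):

// gzip the file locally, upload the .gz, decompress on the data server
$data = file_get_contents('/tmp/export.xml');
file_put_contents('/tmp/export.xml.gz', gzencode($data, 9));

$conn = ftp_connect('data.example.com');
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true);
ftp_put($conn, 'incoming/export.xml.gz', '/tmp/export.xml.gz', FTP_BINARY);
ftp_close($conn);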
Chris
Michael A. Peters wrote:
workerholic@xxxxxxxxxxxx wrote:
Hi Andrew, I think you understand my problem a little,
but if 100 users load this query at the same time, the two MySQL
servers have a lot to do!
So I think caching this query as XML locally on the application server
would make things faster,
but I would like reading this XML document to be as fast as reading
the query from the MySQL server...
I don't know why PHP is so slow at reading the XML file...
Are you saving to file or caching as a query result?
Also note that you can cache an array of rows (at least with APC, but I
suspect memcache as well) - say my_fetch(key) is your function to fetch
from cache and my_store(key,data,life) is your function to store.
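For example, with APC those two wrappers could be as simple as this
(just a sketch):

// example wrappers around APC (memcache would look much the same)
function my_fetch($key) {
    return apc_fetch($key);               // false on a cache miss
}

function my_store($key, $data, $life) {
    return apc_store($key, $data, $life); // $life = lifetime in seconds
}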
$result = my_fetch('big_query');
if (! $result) {
    // cache miss - run the query once and cache the rows
    $result = array();
    $sql = 'your query';
    $rs = mysql_query($sql);
    while ($row = mysql_fetch_object($rs)) {
        $result[] = $row;
    }
    // cache the array of row objects for 6 hours (21600 seconds)
    my_store('big_query', $result, 21600);
}
No XML involved, and you can loop through the results.
If you'd rather do it as XML, you can cache the XML as a string and
then fetch it, importing it into a DOM or whatever to extract your
results.
$xml = my_fetch('queryResultAsXML');
if (! $xml) {
    // generate the XML string and cache it (see the sketch below),
    // e.g. my_store('queryResultAsXML', $xml, 21600);
}
$dom = new DOMDocument('1.0', 'utf-8');
$dom->loadXML($xml);
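The "generate and cache" step could look roughly like this (a sketch
only, building the XML from the same $result rows as in the first
snippet):

// build an XML string from the cached rows and store it
$dom  = new DOMDocument('1.0', 'utf-8');
$root = $dom->appendChild($dom->createElement('rows'));
foreach ($result as $row) {
    $node = $root->appendChild($dom->createElement('row'));
    foreach (get_object_vars($row) as $field => $value) {
        $node->appendChild($dom->createElement($field, htmlspecialchars($value)));
    }
}
$xml = $dom->saveXML();
my_store('queryResultAsXML', $xml, 21600);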
Not sure what you are doing; apologies if these suggestions are
useless or already considered.