hmm, the infrastructure is good, it's just this query....
so to solve my problem I could run MySQL on the application server,
store just this table there,
and read the query from it; that could solve my problem a little, I hope!
Daniel Brown wrote:
On Fri, Jul 10, 2009 at 13:07,
workerholic@xxxxxxxxxxxx<workerholic@xxxxxxxxxxxx> wrote:
hi andrew, I think you understand my problem a little,
but if 100 users load this query at the same time, the two MySQL servers
have a lot to do!
so I think caching this query as XML locally on the application server
would make things faster,
but I would like the same performance reading this XML document as
reading the query from the MySQL server...
I don't know why PHP is so slow at reading the XML file...
It will be slower to read a file than data from an SQL database by
sheer design --- regardless of whether it's XML, CSV, plain text, etc.
And MySQL is faster still because it runs as a server with its own
processing engine, completely independent of the PHP engine and its
spawned processes. Other factors are involved as well, such as disk
seek time and memory capacity, but the SQL-vs-file point is the biggest.
For PHP to locate something within a file, it must either load the
entire file into memory or read it byte-by-byte, line-by-line, from an
explicitly given offset. SQL databases such as MySQL store data on
disk too, but they don't catalog it in quite the same linear fashion.
Further, MySQL can index columns, allowing it to return the matching
rows far faster.
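To make the scan-versus-index contrast concrete, here is a rough sketch
in PHP. The XML layout and the product data are invented for the
example, and the keyed array only stands in for what a real B-tree
index does inside MySQL:

```php
<?php
// Sketch only: invented XML data, illustrating linear scan vs. keyed lookup.
$xml = <<<XML
<products>
  <product><id>1</id><name>widget</name></product>
  <product><id>2</id><name>gadget</name></product>
  <product><id>3</id><name>gizmo</name></product>
</products>
XML;

// XML-file approach: the WHOLE document is parsed before anything can
// be located, and the search itself visits nodes one by one.
$doc = simplexml_load_string($xml);
$found = null;
foreach ($doc->product as $p) {            // linear scan, O(n)
    if ((string)$p->id === '2') {
        $found = (string)$p->name;
        break;
    }
}

// What an index buys you: build a keyed structure once, then jump
// straight to the record, much like MySQL walking an index instead of
// scanning the table.
$index = [];
foreach ($doc->product as $p) {
    $index[(string)$p->id] = (string)$p->name;
}
$fast = $index['2'];                        // keyed lookup, O(1)

echo $found, "\n", $fast, "\n";            // both print "gadget"
```

The difference is invisible with three rows, but with a few hundred
thousand of them the scan cost grows with the table while the indexed
lookup stays nearly flat.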
There's a time and a place for each, but it sounds as though what
you're attempting to do would not be best served by caching it in an
XML file.
Also, something to keep in mind (with no offense intended by any
means): if you have two database servers (using replication) for
load-balancing and, combined, they cannot handle 100 simultaneous
connections and queries, you may want to re-evaluate your
infrastructure and architecture.
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php