Re: PHP mysql data result set compression


 



There's only one way to find out :)

David Yee wrote:
Thanks guys for clarifying the compression aspect.  Using
mysql_unbuffered_query with multiple connections sounds nice and simple- though
would this method mean more disk access than multiple LIMIT queries?  As far
as speed goes, I imagine that loading as big a dataset as possible into
physical memory without disk swapping would be the fastest way to do this?

David

-----Original Message-----
From: Chris [mailto:dmagick@xxxxxxxxx]
Sent: Monday, February 06, 2006 4:50 PM
To: David Yee
Cc: 'php-general@xxxxxxxxxxxxx'
Subject: Re:  PHP mysql data result set compression


Hi David,

 From the comments on unbuffered_query:
However, when using different db connections, it all works of course ...

So create a second db connection and when you run the insert use that instead:

$result2 = mysql_query("insert blah", $dbconnection_two);
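
For what it's worth, a rough sketch of that two-connection pattern (the host,
credentials, database, and table/column names below are just placeholders, not
from David's setup):

<?php
// Two separate links: one for the unbuffered read, one for the writes.
// Passing true as the fourth argument forces a genuinely new connection.
$read  = mysql_connect('localhost', 'user', 'pass');
$write = mysql_connect('localhost', 'user', 'pass', true);
mysql_select_db('mydb', $read);
mysql_select_db('mydb', $write);

// Unbuffered query: rows are streamed to PHP instead of being buffered.
$result = mysql_unbuffered_query('SELECT * FROM source_table', $read);

while ($row = mysql_fetch_assoc($result)) {
    // ... do the field/data manipulation here ...
    $value = mysql_real_escape_string($row['some_column'], $write);

    // The insert goes over the second link, so the unbuffered result
    // set still pending on $read is left untouched.
    mysql_query("INSERT INTO target_table (some_column) VALUES ('$value')", $write);
}
?>

Whether this ends up doing more or less disk work than chunked LIMIT queries is
the part you'd have to benchmark.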


client-compress will compress the data on the way to PHP, but it then has to be uncompressed on the client side, etc. (It won't make much difference if you're talking to a local MySQL server, though; it's mainly useful for servers reached over the network.)


David Yee wrote:

Thanks guys- I think I'll have to do multiple queries using LIMIT as Geoff
suggested since apparently mysql_unbuffered_query() would lose the result
set of the "select * from" query once I run the insert query.  I'm still not
sure why the MYSQL_CLIENT_COMPRESS didn't seem to have an effect, however.
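
If you do go the LIMIT route, a minimal sketch of the chunked loop (the
connection setup, table name, and chunk size are placeholders):

<?php
$link = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('mydb', $link);

$chunk  = 10000;   // rows per query; tune to what fits comfortably in memory
$offset = 0;

do {
    $result = mysql_query("SELECT * FROM source_table LIMIT $offset, $chunk", $link);
    $rows = mysql_num_rows($result);

    while ($row = mysql_fetch_assoc($result)) {
        // ... manipulate the row and insert it into the other table ...
    }

    mysql_free_result($result);   // release the buffered chunk
    $offset += $chunk;
} while ($rows == $chunk);
?>

One caveat: large offsets get progressively slower, since MySQL still has to
step over all the skipped rows; paging on an indexed column
(WHERE id > $last ... LIMIT $chunk) avoids that if the table has one.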

David

-----Original Message-----
From: Chris [mailto:dmagick@xxxxxxxxx]
Sent: Monday, February 06, 2006 4:16 PM
To: David Yee
Cc: 'php-general@xxxxxxxxxxxxx'
Subject: Re:  PHP mysql data result set compression


Hi David,

See http://www.php.net/mysql_unbuffered_query

It won't load the whole lot into memory before returning it to PHP.
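
In other words, something along these lines (the table name is a placeholder):

<?php
// mysql_unbuffered_query() streams rows from the server rather than
// buffering the entire result set in PHP's memory first.
$result = mysql_unbuffered_query('SELECT * FROM source_table');

while ($row = mysql_fetch_assoc($result)) {
    // process one row at a time; memory use stays roughly constant
}
?>

The catch (which is what bites later in this thread) is that you can't send
another query on the same connection until the unbuffered result has been
fully fetched or freed.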

David Yee wrote:


Hi all- is there a way to have a large data result set from MySQL compressed?
E.g. I have a table with over a million rows of data that I want to do a
"select * from" on, then take that result, do some field/data manipulation,
and then insert row-by-row into another table.  The problem is that the result
of the query is so big that it's causing PHP to swap to disk, slowing
everything to a crawl.  Doing a "show processlist" on the mysql console shows
that "Writing to net" is the state of the running "select * from" query.  I
tried adding the flag MYSQL_CLIENT_COMPRESS to both mysql_pconnect() and
mysql_connect(), but it doesn't seem to do any compression (I can tell by the
size of the running PHP process in memory).  Any ideas would be appreciated-
thanks.
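
For reference, MYSQL_CLIENT_COMPRESS is passed via the client_flags argument;
a minimal sketch (host and credentials are placeholders), with the caveat that
the flag only compresses the network stream between client and server and does
nothing for PHP's memory footprint once a buffered result set has arrived:

<?php
// mysql_connect(): client_flags is the fifth argument (the fourth is new_link).
$link  = mysql_connect('localhost', 'user', 'pass', false, MYSQL_CLIENT_COMPRESS);

// mysql_pconnect(): client_flags is the fourth argument.
$plink = mysql_pconnect('localhost', 'user', 'pass', MYSQL_CLIENT_COMPRESS);
?>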

David





--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

