Performance question


 



This is more of a "how would you do it" than a "how can I do it" question.

I haven't had time to try it yet, but I'd like to know how mysql_data_seek() behaves with large result sets.

For example, I'm thinking of building a node tree application where nodes can be attached in several different places through bidirectional links.
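
Roughly, I picture two tables like the sketch below. The names (nodes, links, parent_id, child_id) are only placeholders for the idea, nothing final:

<?php
// Rough schema sketch -- table and column names are placeholders only.
$db = mysql_connect('localhost', 'user', 'pass') or die(mysql_error());
mysql_select_db('tree', $db) or die(mysql_error());

// One row per node.
mysql_query('CREATE TABLE nodes (
    id    INT UNSIGNED NOT NULL PRIMARY KEY,
    label VARCHAR(255) NOT NULL
)', $db) or die(mysql_error());

// One row per link; a node can show up as parent_id and as child_id in
// several rows, which is what lets it hang in different places and be
// walked in either direction.
mysql_query('CREATE TABLE links (
    id        INT UNSIGNED NOT NULL PRIMARY KEY,
    parent_id INT UNSIGNED NOT NULL,
    child_id  INT UNSIGNED NOT NULL,
    INDEX (parent_id),
    INDEX (child_id)
)', $db) or die(mysql_error());
?>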

I was wondering if I could keep two result sets that select everything sorted by ID (one for links, one for nodes) and then just seek to the rows I need, instead of dumping everything into PHP memory. And by large I mean really large: more than the standard 2 MB or so of data that most PHP servers allow a script to hold.
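
In rough code, the idea would be something like this (reusing the connection and tables from the sketch above): the two SELECTs run once, and the rows stay in the buffered result resources rather than in PHP arrays I build myself.

<?php
// Sketch only, continuing from the schema above: select both tables once,
// sorted by ID, and hold on to the result resources instead of copying
// every row into a PHP array.
$nodes = mysql_query('SELECT id, label FROM nodes ORDER BY id', $db)
    or die(mysql_error());
$links = mysql_query('SELECT id, parent_id, child_id FROM links ORDER BY id', $db)
    or die(mysql_error());

// Buffered queries (the default) are what make mysql_data_seek() possible;
// mysql_unbuffered_query() would not allow seeking back to earlier rows.
echo mysql_num_rows($nodes) . ' nodes, ' . mysql_num_rows($links) . " links buffered\n";
?>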

That's where the "how would you do it" part comes in. I think I'd just query my tables, loop over the results, and keep only the row offset (so I can mysql_data_seek() to it later) and the ID in some kind of hash table, or simply an array. That should stay relatively fast without eating too much memory.
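
As a sketch (continuing from the queries above), the index pass and the later lookup would look roughly like this; the same idea would be repeated for the links result:

<?php
// One pass over the nodes result, keeping only id => row offset.
$nodeIndex = array();
$offset = 0;
while ($row = mysql_fetch_assoc($nodes)) {
    $nodeIndex[$row['id']] = $offset++;
}

// Later, pull back a single node on demand by seeking to its row.
function fetch_node($nodes, $nodeIndex, $id)
{
    if (!isset($nodeIndex[$id])) {
        return false;
    }
    mysql_data_seek($nodes, $nodeIndex[$id]);
    return mysql_fetch_assoc($nodes);
}

$node = fetch_node($nodes, $nodeIndex, 42);   // e.g. fetch node 42 only when needed
?>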

That's my solution. How would you approach it without dumping everything into memory and without making recursive SQL calls (which I'm pretty sure would slow everything down)?

Mathieu Dumoulin
Programmer analyst in web solutions
mdumoulin@xxxxxxxxxxxxxxx


