This is more a "how would you do it" than a "how can I do it" question.
I haven't had time to try it, but I want to know how mysql_data_seek()
behaves with large result sets.
For example, I'm thinking of building a node tree application that can
have bidirectional links to nodes attached in different places.
I was wondering if I could have two result sets that query
everything sorted by ID (links and nodes), then just seek to the rows I
need instead of dumping everything into PHP memory. And when I say
large, I mean really large: more than the standard 2 MB of data
most PHP servers allow.
That's where the "how would you do it" comes into play. I think I'd just
query my tables and loop through the results, keeping only the row
number (to do a data_seek later on) and the ID in some kind of hash
table, or simply an array. That would make it relatively fast without
taking too much memory.
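Roughly what I'm picturing (just a sketch; the nodes table, its
columns, and the connection details are made up):

<?php
// Sketch of the offset-map idea: query once, sorted by ID, but keep
// only id => row offset in PHP memory instead of the full rows.
$db = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('tree', $db);

$result = mysql_query('SELECT id, parent_id, label FROM nodes ORDER BY id');

$offsets = array();
$i = 0;
while ($row = mysql_fetch_assoc($result)) {
    $offsets[$row['id']] = $i++;   // remember where each node lives
}

// Later, pull a single node back out without re-querying:
function fetch_node($result, $offsets, $id)
{
    if (!isset($offsets[$id])) {
        return false;
    }
    mysql_data_seek($result, $offsets[$id]);  // jump straight to that row
    return mysql_fetch_assoc($result);
}

$node = fetch_node($result, $offsets, 42);
?>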
That's my solution. How would you do it without dumping everything
into memory or making recursive SQL calls (which I'm pretty sure would
slow everything down)?
Mathieu Dumoulin
Mathieu:
I'm not sure what you're asking, but if it is to limit the amount of
data presented to a user from a search, you could use LIMIT.
The below uses LIMIT:
http://xn--ovg.com/mysql
If you want the code, just ask.
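The gist of it (just a sketch; the table and column names are
placeholders):

<?php
// Page through the table with LIMIT instead of fetching everything.
$page     = isset($_GET['page']) ? max(0, (int) $_GET['page']) : 0;
$per_page = 50;

$result = mysql_query(sprintf(
    'SELECT id, label FROM nodes ORDER BY id LIMIT %d, %d',
    $page * $per_page,   // offset of the first row on this page
    $per_page            // number of rows per page
));

while ($row = mysql_fetch_assoc($result)) {
    echo $row['id'] . ': ' . $row['label'] . "\n";
}
?>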
tedd
--
--------------------------------------------------------------------------------
http://sperling.com/