Real Killer App!

I'm having a heck of a time trying to write a little web crawler for my intranet. Everything seems to be working functionally, but there's a very strange problem I can't nail down.

If I put in an entry and start the crawler, it goes great through the first loop: it gets the URL, gets the page info, puts it in the database, and then parses all of the links out and puts them raw into the database. On the second loop it picks up all the new stuff and does the same thing. By the time the second loop is completed I'll have just over 300 items in the database.

The third loop is where the problem starts. Once it gets into the third loop, it slows down a lot. Then, after a while, if I'm running from the command line, it just drops back to a command prompt; if I'm running in a browser, it returns a "document contains no data" error.

This is with PHP 4.3.1 on a Windows 2000 server. I haven't tried it on a Linux box yet, but I'd rather run it on the Windows server since it's bigger and has plenty of CPU, memory, and RAID space. It's almost like the thing gets confused once there are more than 300 entries in the database. Any ideas as to what would cause this kind of problem?
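
For reference, the loop is roughly the following. This is a simplified sketch, not my exact code: I'm assuming MySQL here just for illustration, the "pages" table and its columns are placeholders, and it omits the duplicate-URL check my real code needs.

<?php
// Rough sketch of the crawl loop (assuming MySQL; table and column
// names are placeholders, and duplicate-URL checking is omitted).
set_time_limit(0);  // lift the default 30-second execution limit

$db = mysql_connect('localhost', 'user', 'pass');
mysql_select_db('crawler', $db);

do {
    // Pick up everything queued on the previous pass
    $result = mysql_query("SELECT id, url FROM pages WHERE fetched = 0", $db);
    $queued = 0;

    while ($row = mysql_fetch_assoc($result)) {
        $html = @file_get_contents($row['url']);
        mysql_query("UPDATE pages SET fetched = 1 WHERE id = " . (int)$row['id'], $db);
        if ($html === false) {
            continue;  // fetch failed; move on to the next URL
        }

        // Parse the links out and put them raw into the database
        preg_match_all('/href="([^"]+)"/i', $html, $matches);
        foreach ($matches[1] as $link) {
            $link = mysql_escape_string($link);
            mysql_query("INSERT INTO pages (url, fetched) VALUES ('$link', 0)", $db);
            $queued++;
        }
        unset($html);  // free the page body before the next fetch
    }
    mysql_free_result($result);
} while ($queued > 0);
?>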

Nick


