I believe I have the right conditions specified, but I do plan to go and review all of that both in the app and in the server environment. As for the status of the code, I'm not sure yet. I need to make something from this, and I haven't quite figured out yet how people make money from open source without charging a fortune for support. I'd rather charge less up front and support it for free, but we'll see what happens.
Nick
Matthew Moldvan wrote:
Even if the system works correctly the first couple of times, it can still go into an endless loop if you do not specify the right conditions; that's true for any programming application ...
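For instance, even a basic crawl loop needs an explicit stop condition and a check for URLs it has already seen. A rough illustration only (not based on your code; the seed URL and the extract_links() helper are made up):

<?php
// Rough illustration of a crawl loop with explicit stop conditions.
// Without the "already seen" check and the page cap, two pages that
// link to each other will keep the loop running forever.
$queue    = array('http://www.example.com/');   // made-up seed URL
$seen     = array();
$maxPages = 1000;                               // arbitrary safety limit

while (count($queue) > 0 && count($seen) < $maxPages) {
    $url = array_shift($queue);
    if (isset($seen[$url])) {
        continue;                               // already crawled, skip it
    }
    $seen[$url] = true;
    foreach (extract_links($url) as $link) {    // made-up helper
        $queue[] = $link;
    }
}
?>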
I am very curious about this project ... is it open source? If so, I'd be interested in taking a look at how you implemented it.
Thanks,
Matthew Moldvan

System Administrator
Trilogy International, Inc.
http://www.trilogyintl.com/
-----Original Message-----
From: Nicholas Fitzgerald [mailto:nick@axelis.com]
Sent: Wednesday, March 12, 2003 7:58 AM
To: php-db@lists.php.net
Subject: Re: Re: Real Killer App!
Well, I'm not locking them out exactly, but for good reason. When a URL is first submitted it goes into the database with a checksum value of 0 and a date of 0000-00-00. If the checksum is 0, the spider processes that URL and updates the record with the proper info. If the checksum is not 0, it checks the date. If the date is past the date for reindexing, it goes ahead and updates the record; it also compares against the stored checksum to see if the URL has changed, in which case it updates.
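In rough PHP pseudocode, the check for each URL looks something like this (a simplified sketch, not the actual code; the table and column names and the helper functions here are just placeholders):

<?php
// Simplified sketch of the decision the spider makes for each URL.
// "urls", "checksum", "last_indexed", and the helpers are placeholders.
$reindexAfterDays = 30; // assumed reindex interval

$row = mysql_fetch_assoc(mysql_query(
    "SELECT checksum, last_indexed FROM urls WHERE url = '$url'"));

if ($row['checksum'] == 0) {
    // New record (checksum 0, date 0000-00-00): index it now.
    index_url($url);                              // placeholder helper
} elseif (strtotime($row['last_indexed']) + $reindexAfterDays * 86400 < time()) {
    // Past the reindex date: fetch the page again and compare checksums.
    $newsum = md5(fetch_page($url));              // placeholder helper
    if ($newsum != $row['checksum']) {
        index_url($url);                          // page changed, update record
    } else {
        touch_last_indexed($url);                 // unchanged, just bump the date
    }
}
// Otherwise the checksum is set and the date isn't due yet, so skip it.
?>

So the only time a URL gets skipped is when its checksum is set and its reindex date hasn't come up yet.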
It does look like it's going into an endless loop, but the strange thing is that it goes through the loop successfully a couple of times first. That's what's got me confused.
Nick
Nelson Goforth wrote:
Do you "lock out" the URLs that have already been indexed? I'm wondering if your system is going into an endless loop?