On Fri, 2006-01-06 at 16:07 -0400, Edinelson.Shimokawa@xxxxxxxxxxxx wrote:
> Hello.
> I am writing to suggest a modification to correct a bug.
> There is a bug in the _mirror_try method in
> /usr/lib/python2.4/site-packages/urlgrabber/mirror.py that works like this:
> - The list of mirrors is stored in an array on gr (a GrabRequest), and
>   when the socket cannot connect to the first mirror, that mirror is
>   removed from the array and the next one is fetched - which is then
>   always the first element of what remains.
> - After it has worked through the whole array, i.e. when the array
>   reaches zero length, the array is populated again, but the _next
>   pointer still points to a mirror that does not exist (the last
>   position plus one).
> This gives an error like this:
>
> Traceback (most recent call last):
>   File "/usr/bin/yum", line 15, in ?
>     yummain.main(sys.argv[1:])
>   File "/usr/share/yum-cli/yummain.py", line 149, in main
>     base.doTransaction()
>   File "/usr/share/yum-cli/cli.py", line 592, in doTransaction
>     problems = self.downloadPkgs(downloadpkgs)
>   File "__init__.py", line 565, in downloadPkgs
>   File "repos.py", line 605, in get
>   File "/usr/lib/python2.4/site-packages/urlgrabber/mirror.py", line 414, in urlgrab
>     return self._mirror_try(func, url, kw)
>   File "/usr/lib/python2.4/site-packages/urlgrabber/mirror.py", line 392, in _mirror_try
>     mirrorchoice = self._get_mirror(gr)
>   File "/usr/lib/python2.4/site-packages/urlgrabber/mirror.py", line 290, in _get_mirror
>     return gr.mirrors[gr._next]
> IndexError: list index out of range
>
> I did this to solve it, but I don't know if it is the best solution:
>
>     def _mirror_try(self, func, url, kw):
>         gr = GrabRequest()
>         gr.func = func
>         gr.url = url
>         gr.kw = dict(kw)
>         self._load_gr(gr)
>         gr._next = 0  # Added line: reset the mirror index for this request

Looks like the right ballpark - I'll check with the urlgrabber maintainer to
make sure he agrees. Thanks.

> One more thing: I suggest an option for a connection timeout. Many Linux
> servers sit behind an anti-virus appliance that performs the real
> download, scans the file, and then delivers it, so when a file is large a
> connection timeout occurs. There is a timeout argument in the kwargs of
> URLGrabber.

timeout is available per-repo in yum - just set timeout = some-number-in-seconds
and it should work.

Thanks, and sorry for the long delay in responding.
-sv
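
For anyone who hits the same traceback: the failure mode described above comes
down to an index that is never reset when the list it points into is rebuilt.
A minimal sketch of that pattern in plain Python - illustrative only, not the
actual mirror.py code, and the mirror URLs are made up:

    # Illustrative sketch of the reported bug class, not urlgrabber itself.
    mirrors = ['http://mirror-a/repo/', 'http://mirror-b/repo/']

    working = list(mirrors)   # per-request copy, like gr.mirrors
    _next = 0                 # index of the mirror to try, like gr._next

    # Each connection failure drops the bad mirror; suppose the index is
    # also advanced along the way.
    while working:
        working.pop(0)
        _next += 1

    # The list is repopulated for the next attempt, but _next (now 2)
    # was never reset:
    working = list(mirrors)
    print(working[_next])     # IndexError: list index out of range

Resetting the index whenever the list is reloaded - which is what the added
gr._next = 0 line does - keeps the lookup in range again.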
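
On the timeout: a minimal sketch of the per-repo setting, assuming a repo file
such as /etc/yum.repos.d/example.repo - the repo id, name, and baseurl below
are made up, and 300 is just an example value in seconds:

    [example]
    name=Example Repository
    baseurl=http://mirror.example.org/repo/
    enabled=1
    # give slow anti-virus gateways up to 300 seconds before timing out
    timeout=300

The same value can also be passed directly to urlgrabber as the timeout
keyword argument mentioned above, e.g. URLGrabber(timeout=300.0).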