Hello again,

I've meshed the two ideas: one queue per repo, plus a per-server and a global maximum on the number of download threads. The conf file now has three new variables:

- max_threads (8): the maximum number of download threads active at any instant on your local machine
- threads_per_server (2): of those max_threads, only this many can be actively downloading from a specific server simultaneously. In a certain sense this gives you "one queue per repo", though really you can have several, one per server, as the next variable allows.
- servers_per_repo (4): at most this many servers will be used for a given repo

Thus, with the default settings, if you just want to update from 'updates', multithread will select 4 servers from that repo, queue them up, and then allow only 2 downloads from each server, up to the system-wide max_threads. (A toy sketch of that scheduling is at the bottom of this mail.)

It is also worth noting that I changed the MultiThread helper class a little. There is now an add_package() function that takes the remote path to download and a local path to save it to. Once all the packages are added, a simple call to fetch_packages() begins downloading them all. (I think this is close to the urlgrabber functionality; a usage sketch is below as well.)

Any other comments? Thanks for what I've got so far!

--michael

--
Michael J. Schultz
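
To make the scheduling concrete, here is a toy sketch of how the three conf variables could interact. This is not the code in my patch: the semaphore approach and the helper names (pick_servers, slots_for, download) are made up purely for illustration, and only the three conf variables themselves are real.

import threading
import urllib.request
from urllib.parse import urlparse

MAX_THREADS = 8          # max_threads
THREADS_PER_SERVER = 2   # threads_per_server
SERVERS_PER_REPO = 4     # servers_per_repo

# One global pool of download slots, plus a smaller pool per server.
global_slots = threading.BoundedSemaphore(MAX_THREADS)
server_slots = {}
server_slots_lock = threading.Lock()

def pick_servers(mirrorlist):
    # Use at most servers_per_repo servers for any one repo.
    return mirrorlist[:SERVERS_PER_REPO]

def slots_for(server):
    # Lazily create the per-server semaphore.
    with server_slots_lock:
        if server not in server_slots:
            server_slots[server] = threading.BoundedSemaphore(THREADS_PER_SERVER)
        return server_slots[server]

def download(url, local_path):
    # A thread must hold a global slot *and* a slot on its server
    # before it is allowed to transfer anything.
    server = urlparse(url).netloc
    with global_slots, slots_for(server):
        urllib.request.urlretrieve(url, local_path)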
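
And here is how the revised MultiThread helper is meant to be driven: queue everything with add_package(), then one call to fetch_packages() drains the queue. The class body below is a toy stand-in for the real one in the patch, reusing the download() helper from the sketch above; only the two method names and their arguments reflect the actual interface.

class MultiThread:
    # Toy stand-in: only the add_package()/fetch_packages() calling
    # convention is meant to match the real helper.
    def __init__(self):
        self.jobs = []                      # (remote url, local path)

    def add_package(self, remote, local):
        self.jobs.append((remote, local))   # just queue it for now

    def fetch_packages(self):
        # The real code would presumably reuse max_threads workers;
        # spawning one thread per package and letting the semaphores
        # gate them keeps this sketch short.
        threads = [threading.Thread(target=download, args=job)
                   for job in self.jobs]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

mt = MultiThread()
mt.add_package("http://mirror1.example.org/updates/foo-1.0-1.i386.rpm",
               "/var/cache/yum/updates/foo-1.0-1.i386.rpm")
mt.fetch_packages()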