[Yum] [UG] parallelizing downloading


 



On Wed, 2005-06-29 at 14:05 -0700, Michael Stenner wrote:
> On Wed, Jun 29, 2005 at 02:35:59PM -0600, Greg Knaddison wrote:
> > This came up in the BT/yum discussions and it got me thinking...
> > 
> > What are the thoughts on parallel downloading?  I resubjected this to
> > be a UG issue, but maybe it should be handled in yum and not UG.
> 
> It would definitely require changes in yum, and to be done most
> cleanly, would require changes to urlgrabber.  Currently, all of the
> grabber, mirrorgroups, and keepalive code is threadsafe, so there are
> really two ways to approach it:
> 
>   1) threads
>   2) select loops
> 
> It turns out that neither of these is trivial.  They each have their
> strengths and weaknesses.  Ryan and I put some serious thought into
> this a while back but basically found that it's a non-trivial problem
> given all the other stuff urlgrabber does, and then we got busy :)
> 
> I'm actually open to picking this issue up again if there's interest.
> We imagined a "batch grabber" which would be a grabber wrapper object
> much like the mirrorgroup stuff, but which would take a list of files
> (perhaps a queue for pipelining applications) and go to town on them.
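For illustration, a threaded batch grabber along those lines could be sketched as below. This is only a sketch, not urlgrabber's actual API: the `BatchGrabber` name, the `fetch_all` method, and the use of `urllib` in place of urlgrabber's own fetch machinery are all assumptions.

```python
# Hypothetical batch-grabber sketch: worker threads pull (url, filename)
# jobs from a queue and download them in parallel.  NOT the urlgrabber API.
import queue
import threading
import urllib.request


class BatchGrabber:
    def __init__(self, num_workers=4):
        self.num_workers = num_workers

    def fetch_all(self, jobs):
        """jobs: list of (url, filename) pairs.
        Returns a dict mapping url -> saved filename, or the exception
        raised while fetching it."""
        q = queue.Queue()
        for job in jobs:
            q.put(job)
        results = {}
        lock = threading.Lock()

        def worker():
            while True:
                try:
                    url, filename = q.get_nowait()
                except queue.Empty:
                    return  # queue drained; this worker is done
                try:
                    urllib.request.urlretrieve(url, filename)
                    outcome = filename
                except OSError as e:
                    outcome = e
                with lock:
                    results[url] = outcome

        threads = [threading.Thread(target=worker)
                   for _ in range(self.num_workers)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return results
```

Feeding the queue incrementally instead of up front would give the pipelining behavior mentioned above; the sketch keeps it simple with a fixed job list.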

One of the tricks I remember coming up before is sensibly representing
parallel downloads on an 80x25 display without resorting to something
like ncurses or snack.  Sensibly skipping a mirror or canceling a single
download gets harder in that situation, too.
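One common trick for showing several downloads at once without ncurses or snack is to redraw a fixed block of lines using the ANSI cursor-up escape. A minimal sketch, purely illustrative (the function name and argument shapes are made up):

```python
# Redraw one progress line per download by moving the cursor back up
# over the previously drawn block.  Needs only ANSI escapes, no curses.
import sys


def render_progress(progresses, out=sys.stdout, first_draw=False):
    """progresses: list of (name, fraction) pairs, fraction in [0, 1]."""
    if not first_draw:
        # ESC [ n A : move the cursor up n lines to overwrite the old block
        out.write("\x1b[%dA" % len(progresses))
    width = 40
    for name, frac in progresses:
        filled = int(frac * width)
        bar = "=" * filled + " " * (width - filled)
        out.write("%-20s [%s] %3d%%\n" % (name[:20], bar, int(frac * 100)))
    out.flush()
```

This handles the display side; it says nothing about the harder interaction problem of letting the user skip a mirror or cancel one download mid-flight.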

While I agree parallel downloads could speed up certain operations in
yum, I'm concerned about how complex they could make the interface.

What do y'all think?
-sv



