On Thu, 2008-01-24 at 08:05 -0600, Les Mikesell wrote:
> I think you are missing my point, which is that it would be a huge win
> if yum automatically used typical existing caching proxies with no extra
> setup on anyone's part, so that any number of people behind them would
> get the cached packages without knowing about each other or that they
> need to do something special to defeat the random URLs.

 HTTP doesn't define a way to do this, and the Pragma header suggestion is a pretty bad abuse of HTTP: suddenly the origin server going down means you can't get the data from the proxy. Please don't assume half of a solution; that never works well.

 What you _actually_ want is: "On my group of machines X, try not to download data more than once."

 At the ISP level this is solved by getting your mirror into mirror manager directly. At the personal level it is likely better solved by having something like "a zeroconf-like service to share packages across the local network". There has even been some work to do this.

 In neither case is "work around HTTP's design in yum" a good solution, IMNSHO.

-- 
James Antill <james.antill@xxxxxxxxxx>
Red Hat
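[To illustrate the "zeroconf-like service to share packages" idea mentioned above, here is a minimal sketch, not the actual work referred to in the post: peers on a LAN answer a UDP query for a package they already have cached, so a yum-like client could fetch from a neighbour before hitting a mirror. The port number, wire format, and all function names here are made up for illustration.]

```python
# Hypothetical LAN package-sharing sketch: a responder answers
# "who has <package>?" queries for packages in its local cache,
# and a client asks before falling back to a remote mirror.
import json
import socket
import threading

PORT = 50871  # arbitrary port chosen for this sketch


def serve_cache(cache, host="127.0.0.1", stop=None):
    """Answer queries for packages present in our local cache dict."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, PORT))
    sock.settimeout(0.2)  # wake up periodically to check the stop flag
    while stop is None or not stop.is_set():
        try:
            data, addr = sock.recvfrom(1024)
        except socket.timeout:
            continue
        query = json.loads(data)
        pkg = query.get("want")
        if pkg in cache:
            reply = {"have": pkg, "url": cache[pkg]}
            sock.sendto(json.dumps(reply).encode(), addr)
        # If we don't have it, stay silent; the client will time out.
    sock.close()


def find_peer_copy(pkg, host="127.0.0.1", timeout=0.5):
    """Ask the network for a cached copy; return its URL, or None."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(json.dumps({"want": pkg}).encode(), (host, PORT))
    try:
        data, _ = sock.recvfrom(1024)
        return json.loads(data)["url"]
    except socket.timeout:
        return None  # nobody nearby has it; fall back to the mirror
    finally:
        sock.close()
```

A real implementation would use mDNS/DNS-SD (actual zeroconf) for discovery and verify package signatures before trusting a peer's copy; this sketch only shows the request/response shape of the idea.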
--
fedora-devel-list mailing list
fedora-devel-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/fedora-devel-list