[Yum] Re: Questions about headers ...


 



>
>
>>
>>    First, is there a way to "package" the headers, so that I don't have 
>>to wait almost an hour to do my first "yum check-update"? I could also 
>>save a little bandwidth if I could put the basic headers in place before 
>>an update ...
>>    
>>
>
>This has been suggested before, and of course you can do it by hand, but
>
    See, I never thought I had an original thought ...

>bandwidth is bandwidth and once the headers have been created/compressed
>all you're really saving is the per-transfer overhead, not the bandwidth
>per se.
>
    But that is the case; look at the actual sizes:
du -ch freshrpms/header* os/header* updates/header*
16K     freshrpms/header.info
1.9M    freshrpms/headers
88K     os/header.info
20K     os/headers
12K     updates/header.info
1.3M    updates/headers
3.3M    total

    Which does not seem like a lot, but it currently takes a good half 
hour to transfer on a new yum install ...

>>    Another question: would it not be better to just download the 
>>headers for the packages installed on a PC that one would be doing an 
>>update on?
>>    
>>
>
>This won't work.  yum checks across ALL packages for dependencies and
>conflicts.  This is fairly complex, as a package may need another
>package or be needed BY another package, all recursively, until all
>dependencies are resolved.  So it isn't possible to predict ahead of
>time which package headers are needed.  One reason that yum functions so
>fast is BECAUSE it has a local copy of all of the headers.
>
    I understand the idea behind the local copy. Currently, when I do an 
install, I only do the update using the basic repos in the yum config, 
and once I have done the yum check-update and finished the updates 
(about 20 min to a few hours [it took a whole day when I did a yum 
check-update in Botswana, where bandwidth is even worse than in South 
Africa]) do I then add all the extra repos I use ...

>>    I ask these questions because bandwidth in South Africa is like 
>>water in Mexico (please don't flame me, it just sounds poetic): 
>>expensive and hard to find in quality ... if there are any ways to 
>>reduce bandwidth requirements I am all ears ... and I am sure that the 
>>mirrors and hosts for yum repos would welcome any help as well ...
>>    
>>
>
>The best way is to create a local repository (usually by mirroring an
>existing repository).  This costs you a whole lot of bandwidth -- once
>-- to set up the repository, and then you use LOCAL bandwidth in your
>LAN to do all the updates and whatever.
>
    I do something like that: basically I back up my /var/cache/yum 
directory and take it with me when I do an install ... I am looking at 
setting up a local repo for the computers on my network, but have just 
not got there yet ...
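    When I do get there, I imagine the client side would just be a repo 
stanza pointing at the LAN box. A sketch only; the hostname and paths 
here are made up, not a real server:

```ini
# Hypothetical yum.conf stanza pointing clients at a LAN mirror.
# "server.local" and the /mirror/ path are examples, not real hosts.
[updates-local]
name=Fedora Updates (local mirror)
baseurl=http://server.local/mirror/updates/
# or, on the mirror box itself, straight off disk:
# baseurl=file:///var/mirror/updates/
```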

>The rsync tool can be used to create the mirror, and then used
>periodically thereafter to refresh the mirror.  rsync is totally minimal
>in its use of bandwidth -- it can be set to only send files that have
>changed, to compress all files before sending them, to ignore files that
>you don't want.  So you can actually construct an rsync command that
>mirrors only SELECTED PARTS of a repository elsewhere initially, and
>then only updates those parts if and only if the files have been updated
>on the original repository you are mirroring.
>
    I will have to look into this ... it will be kewl to have something 
solid to back up and keep ...
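    For my own notes, a minimal rsync run along the lines you describe 
might look like this. A sketch only: the source here is a local scratch 
directory so it can actually be run; in real use it would be something 
like rsync://mirror.example.org/fedora/updates/ (a made-up URL). The 
flags are standard rsync: -a archive, -v verbose, -z compress in 
transit, --delete to drop files removed upstream, --exclude to skip the 
parts you don't want:

```shell
#!/bin/sh
# Sketch: mirror only selected parts of a repository with rsync.
# A local directory stands in for the remote repo to keep this runnable.
SRC=/tmp/yumdemo/src
DST=/tmp/yumdemo/mirror

mkdir -p "$SRC/headers" "$SRC/debug"
echo "header data" > "$SRC/headers/header.info"
echo "debuginfo"   > "$SRC/debug/skip-me.rpm"

# -a preserve attributes and recurse, -v verbose, -z compress,
# --delete remove files gone upstream, --exclude skip unwanted parts
rsync -avz --delete --exclude 'debug/' "$SRC/" "$DST/"

ls "$DST"
```

    Rerunning the same command later transfers only what changed, which 
is where the bandwidth saving comes from.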

>I have a DSL link into my home, and DSL is slow as molasses (384 Kbps
>inbound, on a good day).
>
    ADSL has just been introduced in South Africa, and only to selected 
areas where the local telco believes it can charge large fees and make 
its money back quickly ... as for the average company with enough money 
to get fixed lines, they normally have 64 Kbps lines, and everybody else 
is likely to have modems, so your 384 Kbps seems like heaven here ... 
sorry to get carried away ...

>  I have a bunch of hosts at home to yum install
>or yum update.  Disk, on the other hand, is absurdly cheap and
>plentiful.  So I mirror the repository(s) at Duke via rsync onto my home
>server.  The first time this was immensely painful -- it took something
>like a day to complete the mirror.  
>
    This is what I will end up doing; I just have to find the time to 
understand everything that I will be using, so when something stops 
working I at least know where to look ... as for a day to complete, I 
watched an installation in Botswana take a day just to download the 
headers ... not kewl ...

>However now when I rerun the rsync
>script it updates in a matter of seconds (if there is nothing new to
>download) to minutes (if a half-dozen rpm's have been updated in the
>meantime).
>
    I see the light in this, and as I said, I will get this going in the 
near future for the computers on my network. But could a procedure be 
looked at whereby one could almost package /var/cache/yum and put it in 
place on a target computer? This would save quite a bit of bandwidth 
with a new installation ... it would also save backup disk space (CD-R) 
with the option I asked about a little while ago (clean oldpackages) ...
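    In the meantime the packaging idea can be faked with plain tar. A 
sketch, using a scratch directory to stand in for /var/cache/yum so it 
is runnable as-is; on a real box you would tar the cache itself (as 
root) and unpack it on the target before the first yum check-update:

```shell
#!/bin/sh
# Sketch: bundle a yum header cache and restore it on another machine.
# /tmp/yumdemo2/cache stands in for /var/cache/yum in this example.
CACHE=/tmp/yumdemo2/cache
TARGET=/tmp/yumdemo2/target

mkdir -p "$CACHE/updates/headers"
echo "dummy header" > "$CACHE/updates/headers/foo-1.0-1.i386.hdr"

# On the source box: pack the cache directory into one tarball
tar -C "$(dirname "$CACHE")" -czf /tmp/yumdemo2/yum-cache.tar.gz \
    "$(basename "$CACHE")"

# On the target box: unpack it where yum expects to find its cache
mkdir -p "$TARGET"
tar -C "$TARGET" -xzf /tmp/yumdemo2/yum-cache.tar.gz

ls "$TARGET/cache/updates/headers"
```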

Thanks for the quick response ...
Mailed
Lee


