> Yes! Yes! :)
> By the way, a friend and I have made our own CVS repository with the yum
> code and added some stuff like HTTP reget support. I want to work like
> this: we write part of the code, test it, and send you patches. If you
> agree with this, get ready for really _big_ patches :)

I think I will tend to agree with Linus: big patches suck. Can you isolate
what they are patching? ie: a reget patch, a tabbing patch, a doc patch,
etc.

> It is quite reasonable. What about making the CVS public?
> I think it's needed not only by me :)

Public CVS is not entirely secure and it makes me edgy.

> I don't like savannah or sf.net since they're too bloated. All we need is
> a CVS repository. All these collaboration tools only make you weak :)

umm. sure. Well, I kinda like some of the features that the SourceForge
stuff offers, but I'm not sure. I'll talk with some folks I know who are
better at things like this than I am and see what I can sort out that is
also secure.

> SV> 5. I'd like to start a listing of publicly accessible yum
> SV>    repositories - just so people can find stuff they have and add
> SV>    them to their yum.conf's
>
> We are first! :)
>
> [mastersite]
> name=ASPLinux 7.2 Master Site
> baseurl=http://download.asplinux.ru/i386/RPMS.7.2/
>
> [updates]
> name=ASPLinux 7.2 Updates
> baseurl=http://download.asplinux.ru/i386/updates/7.2/

Now I have to figure out a way to list them sanely :)

> SV> 6. list of features that have been requested:
> SV>    a. mirror support
> SV>    b. specify config file on command line
>
> By URL! Please!
>
> yum -c 'http://dulug.duke.edu/yum/config' update yum
>
> What could be better? :)

hmm. In keeping with some of Icon's ideas, I think we limit the URL
config-file locations for the moment. I need to talk to Icon more about
what he has been thinking, though. In principle I like the idea.

> Yes! :)
> I think we need to add a "weight" to each server.
> In short: we don't need to download KDE 10.0.2 from ftp.redhat.com if we
> have the same package on ftp.myoffice.smthg :)

hmm. Judging weight will not be fun. It will also require a shuffling of
how nevral hands back paths. Then again, much of this will require that,
so.... (A rough sketch of what I mean is at the end of this mail.)

> Also, as I said, we made a sort of abstraction layer for downloading.
> It's very useful because the original urllib URL handlers are too weak
> for serious work.

Well, that was sorta why urlgrab exists. I just used urllib because it
was 1. easy, 2. short, and 3. did what I needed (which was http). I figure
urlgrab could use urlparse and just split out to separate ftp, http, file,
gopher, etc. functions for each scheme. (See the second sketch at the end
of this mail.)

> Congratulations.
> I hope you had an excellent vacation! :)

Actually, I caught a cold and got sick, but I wasn't at work, which is
nice, and I got to read some more.

-sv
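
P.S. A rough sketch of the server-weight idea, just to make the discussion
concrete. None of this is in yum today; the dict shape, the function name,
and the weight numbers are all made up for illustration. The idea is simply
to try the heavier (closer/faster) servers first and keep the rest as
failovers:

def order_servers(servers):
    # servers maps baseurl -> weight; a bigger weight means "prefer me".
    # Returns the baseurls sorted so the heaviest server is tried first.
    decorated = [(-weight, url) for (url, weight) in servers.items()]
    decorated.sort()
    return [url for (ignored, url) in decorated]

# made-up example: prefer the office mirror over ftp.redhat.com
servers = {'ftp://ftp.redhat.com/pub/': 10,
           'ftp://ftp.myoffice.smthg/mirror/': 100}
for url in order_servers(servers):
    print url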
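
P.P.S. And a sketch of the urlparse split I mentioned for urlgrab. Again,
this is not current urlgrab code, just the shape of it; the per-scheme
handlers here all cheat and lean on urllib.urlretrieve, which is exactly
the part that would get replaced with real code (reget, retries, etc.):

import urllib
import urlparse

def grab_http(url, dest):
    # a real http handler would do reget/resume, retries, etc.
    return urllib.urlretrieve(url, dest)[0]

def grab_ftp(url, dest):
    return urllib.urlretrieve(url, dest)[0]

def grab_file(url, dest):
    return urllib.urlretrieve(url, dest)[0]

handlers = {'http': grab_http,
            'ftp': grab_ftp,
            'file': grab_file}

def urlgrab(url, dest):
    # split on the url scheme and hand off to the matching grabber
    scheme = urlparse.urlparse(url)[0]
    if scheme not in handlers:
        raise ValueError, 'unsupported url scheme: %s' % scheme
    return handlers[scheme](url, dest)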