[Yum] I'm back and future stuff

Date: 27 Jun 2002 02:59:40 -0400
From: seth vidal <skvidal@xxxxxxxxxxxx>

Hello!

>> Yes! Yes! :)
>> By the way - a friend and I have made our own cvs repository with the yum
>> code and coded some stuff like http reget support etc. I want to work like
>> this: we make a piece of code, test it and give you the patches. If you
>> agree with this - get ready for really _big_ patches :)

SV> I think I will tend to agree with linus, big patches suck. Can you
SV> isolate what they are patching? ie: a reget patch, a tabbing patch, a
SV> doc patch etc etc.

Of course I can :)

>> That is quite reasonable. What about making the CVS public?
>> I think it's needed not only by me :)
SV> public cvs is not entirely secure and it makes me edgy.

What about nightly builds?


>> Yes! :)
>> I think we need to add a "weight" to each server.
>> In short - we don't need to download KDE10.0.2 from ftp.redhat.com if
>> we have this package on ftp.myoffice.smthg :)

SV> hmm. judging weight will not be fun. It will also require a shuffling of
SV> how nevral hands back paths. Then again much of this will require that
SV> so....

No, no! :)
You only need to add a new property - weight - to the package structure;
then when you download a package, you select the package with the same name
but the greater weight.
I can produce a patch if anybody besides me needs it.
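
Something like this (just a rough sketch - pick_by_weight and the tuple
layout are only for illustration, not real yum code):

# each candidate is (name, version, server, weight); when several servers
# offer the same name/version, keep the one with the greatest weight
def pick_by_weight(candidates):
    best = {}
    for (name, version, server, weight) in candidates:
        key = (name, version)
        if key not in best or weight > best[key][3]:
            best[key] = (name, version, server, weight)
    return best.values()

candidates = [
    ('kde', '10.0.2', 'ftp.redhat.com', 1),
    ('kde', '10.0.2', 'ftp.myoffice.smthg', 10),
]
print pick_by_weight(candidates)
# -> [('kde', '10.0.2', 'ftp.myoffice.smthg', 10)]
-------------------------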

>> Also, as I said - we made a sort of abstraction for downloading.
>> It's very useful because the original urllib url handlers are too weak for
>> serious work.
SV> well that was sorta why urlgrab exists.
SV> I just used urllib b/c it was 1. easy 2. short and 3. did what I needed
SV> (which was http)
SV> I figure urlgrab could use urlparse and just split out to separate ftp,
SV> http, file, gopher, etc functions for each.

look:

import sys

# downloader, reget_mode, log and errorlog come from the downloading
# abstraction module we keep in our cvs tree.

def urlgrab(url, filename, grab_mode = reget_mode.SMART):
    try:
        _downloader = downloader(filename, url, grab_mode)
    except IOError, e:
        log(5, 'Error opening local file "%s" to download from "%s"' % (filename, url))
        raise

    if grab_mode != reget_mode.AGAIN:
        # SMART mode: if a partial local file already exists, continue
        # (reget) from its current size instead of starting over.
        try:
            _f_size = _downloader.file_size()
            if _f_size > 0:
                log(5, 'Continue "%s" %s download from "%s"' % (filename, _f_size, url))
        except:
            # no local file yet - nothing to continue
            pass
    else:
        # AGAIN mode: always download the whole file from scratch.
        log(5, 'Forced download "%s" from "%s"' % (filename, url))
    try:
        _downloader.open()
        _downloader.get()
        _downloader.close()
    except IOError, e:
        errorlog(0, 'IOError: %s, URL: %s' % (e, url))
        sys.exit(1)
    return filename
-------------------------
Do you understand what I mean?
It's more abstract, and it looks and works much better for me.
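
For comparison, the urlparse split you describe could look roughly like
this (the _http_get / _ftp_get / _file_get handlers are made-up names,
just to show the idea of one function per scheme):

import urlparse

def _http_get(url, filename):
    pass   # real http download (with reget) would go here

def _ftp_get(url, filename):
    pass   # real ftp download would go here

def _file_get(url, filename):
    pass   # a local file "download" is just a copy

_handlers = {'http': _http_get, 'ftp': _ftp_get, 'file': _file_get}

def urlgrab_by_scheme(url, filename):
    scheme = urlparse.urlparse(url)[0]        # 'http', 'ftp', 'file', ...
    try:
        handler = _handlers[scheme]
    except KeyError:
        raise IOError('unsupported url scheme: %s' % scheme)
    return handler(url, filename)
-------------------------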


>> 
>> Congratulations.
>> I hope you had an excellent vacation! :)

SV> Actually, I caught a cold and got sick but i wasn't at work, which is
SV> nice and I got to read some more.

In other words - bad mood and low spirits.
We need you! Don't get sick any more! :)
........................................................................
IRC: irc.openprojects.net #asplinux                      Grigory Bakunov 
EMAIL: black@xxxxxxxxxxx                           ASPLinux Support Team
ICQ: 51369901                                     http://www.asplinux.ru
-----BEGIN GEEK CODE BLOCK-----
GCS/MU d-(--) s:- a- C+++>++$ UBLAVSX+++$ P+ L++++$ E++$ W++ N+>- o? K?
w-- O- M V-(--) PS+ PE+ !Y PGP+>++++ t+ 5++ X+++ R+++ tv+>-- b+++ ?DI D+
G++ e>++$ h- r++ y+ z++(+++)
------END GEEK CODE BLOCK------

