Processing Large Binary News Groups?


 



Hi All:

I subscribe to Giganews, which now retains binaries for 100 days. With some binary groups showing a million or more articles, none of the usual Usenet tools seems to like a group this large: trn says "out of memory, signal 1 1 bye", and NewsLite gave me error messages and bombed out.

My usual workflow is to browse a group in trn first and leave only the articles I want to decode. Now that I can pipe yEnc-encoded messages through uudeview, I can once again use pine for the decoding step. Anyway, I get the impression that it's the total size rather than the number of articles that causes the trouble. Is there some better way to handle large groups, other than just catching up and starting from now?
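For anyone curious what the uudeview step is actually doing with those messages: yEnc is just a byte-shift encoding. Each data byte is the transmitted character minus 42 (mod 256), and a leading "=" marks an escaped character whose value is the following character minus 64, then minus 42. A minimal decoder sketch (my own illustration, not uudeview's code):

```python
def ydecode(data: bytes) -> bytes:
    """Decode the body of a yEnc-encoded article (header/trailer lines removed)."""
    out = bytearray()
    escaped = False
    for b in data:
        if b in (10, 13):          # CR/LF are line structure, not data
            continue
        if escaped:
            out.append((b - 64 - 42) & 0xFF)   # escaped byte: subtract 64, then 42
            escaped = False
        elif b == 0x3D:            # '=' introduces an escaped byte
            escaped = True
        else:
            out.append((b - 42) & 0xFF)        # normal byte: subtract 42
    return bytes(out)
```

For example, `ydecode(bytes([146, 143, 150, 150, 153]))` yields `b"hello"`. A real tool like uudeview additionally parses the =ybegin/=ypart/=yend lines, reassembles multipart posts, and checks the CRC32 in the trailer.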
Thanks so much in advance
Hart

_______________________________________________
Blinux-list mailing list
Blinux-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/blinux-list
