Re: wget replacement?


On Thu, 2004-04-01 at 22:42, Steve Buehler wrote:
> Does anybody know of a replacement for wget that will work as well as wget, 
> but will not have the file size limit problem?  wget can't get a file that 
> is bigger than 2 GB in size.  On the wget mailing list, it is reported as a 
> bug by some and as just a feature request by others.  I am trying to mirror 
> an ftp directory for a client so they can have a backup, but one file stops 
> the wget download process.  I can't find a way to exclude that one file 
> from the wget download, so now I have to see if there is another program out 
> there that can work as well.  Here is the command that I use.  Yes, I have 
> replaced the server IP with a fictitious one.  The actual IP is for an 
> internet IP.
> 
> wget --passive-ftp --mirror --no-host-directories --cut-dirs=1 
> --directory-prefix=/home/SHARE1/ 'ftp://login:password@xxxxxxxxxxx/SHARE1/'
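
One possible workaround for skipping just that one oversized file, if I
remember right, is wget's reject list; the accept/reject options should
also apply to recursive FTP retrievals.  The filename below is only a
placeholder for whatever the problem file is actually called:

wget --passive-ftp --mirror --no-host-directories --cut-dirs=1 \
  --reject='huge-backup.tar' \
  --directory-prefix=/home/SHARE1/ 'ftp://login:password@xxxxxxxxxxx/SHARE1/'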

Just a thought...

rsync may be better for this type of work.
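
For example, assuming the client can also reach that box over SSH or an
rsync daemon (rsync does not speak FTP, so FTP-only access rules it out),
something along these lines should cope with files over 2 GB and lets you
skip the problem file.  The login, paths, and excluded filename are
placeholders:

rsync -av --exclude='huge-backup.tar' \
  login@xxxxxxxxxxx:/SHARE1/ /home/SHARE1/

Add --delete if the local copy should also drop files that have been
removed on the server; without it rsync only adds and updates.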


-- 
"An opinion is like an asshole - everybody has one."
    - Clint Eastwood as Harry Callahan, The Dead Pool - 1988.


-- 
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list
