Re: wget replacement?

At 08:42 4/1/2004, you wrote:
Does anybody know of a replacement for wget that will work as well as wget, but without the file-size limit problem? wget can't get a file that is bigger than 2 GB. On the wget mailing list, it is reported as a bug by some and as just a feature request by others. I am trying to mirror an FTP directory for a client so they can have a backup, but one file stops the wget download process. I can't find a way to exclude that one file from the wget download, so now I have to see if there is another program out there that can work as well.

Here is the command that I use. Yes, I have replaced the server IP with a fictitious one; the actual address is a public Internet IP.

wget --passive-ftp --mirror --no-host-directories --cut-dirs=1 --directory-prefix=/home/SHARE1/ 'ftp://login:password@xxxxxxxxxxx/SHARE1/'

rsync (communicating over ssh) should be a perfect solution for you and provide better security and functionality than wget in this case. Something like:


# rsync -ave ssh user@remotehost:/path/to/files/* /local/path/

is the basic command. You can use the exclude and include directives to fine-tune what is or is not mirrored, and rsync will transfer only files that have changed. Simply an amazing program. Read the man page for more details, since it has *lots* of power and flexibility.
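For example, to mirror your SHARE1 directory while skipping the one problem file (the file name below is just a placeholder for whatever yours is called), something along these lines should work:

# rsync -av --exclude='hugefile.iso' -e ssh login@remotehost:/SHARE1/ /home/SHARE1/

The trailing slash on the source tells rsync to copy the contents of SHARE1 rather than the directory itself, and wget's 2 GB limit should not bite here, assuming your local filesystem supports large files.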

You could also look at curl as an alternative, but I am not very familiar with it.
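That said, if all you really need is that one oversized file, a single curl fetch might do it; passive FTP is curl's default, and the path below just mirrors your example (the file name is again a placeholder):

# curl -u login:password -O 'ftp://remotehost/SHARE1/hugefile.iso'

The -O flag saves the file locally under its remote name. Whether this gets past 2 GB depends on your curl build having large-file support, so treat it as something to test.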


--
Rodolfo J. Paiz
rpaiz@xxxxxxxxxxxxxx
http://www.simpaticus.com


