Re: wget replacement?

On Thu, 01 Apr 2004 09:16:53 -0600, Steve Buehler wrote:

> > > wget --passive-ftp --mirror --no-host-directories --cut-dirs=1
> > > --directory-prefix=/home/SHARE1/ 'ftp://login:password@xxxxxxxxxxx/SHARE1/'
> >
> >
> >How about "curl"
> >        curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE, HTTP
> >or HTTPS syntax.
> >
> >or ftp, or rsync.
> >
> >I must say, however, that the 2 GB limit sounds like a compiled-in OS or
> >user resource limit.  Can your user create a file >2GB on the same file
> >system?  You may want to check that before going further.
> 
>  From what I understood about curl, it wouldn't recurse through all of 
> the directories on the other server.  I understood that I would have to 
> list each and every file for it to do that.  Not sure where I missed 
> that in the man pages.

How about lftp or ftpcopy? Both support mirroring.
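With lftp, mirroring a remote FTP tree is a one-liner along these lines (host and credentials are placeholders, paths borrowed from the original wget command):

```shell
# lftp's mirror command is recursive by default; --continue resumes
# partial transfers. Host/login/password here are placeholders.
lftp -u login,password ftp.example.com \
     -e 'set ftp:passive-mode on; mirror --continue /SHARE1 /home/SHARE1; quit'
```

lftp also handles files over 2 GB on systems with large-file support, which may sidestep the limit discussed above.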




-- 
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list
