web page problem


When a file on a web page is updated to a new version, is it possible to get
the URLs of the new versions so they can be downloaded?  HTML doesn't
support wildcards, so this can't be done with wget alone.  I'd like to be
able to do this with a script if at all possible.  I know Perl handles
wildcard-style matching well, but I don't know whether Perl can handle a job
like this.
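One way to get wildcard-like matching against a web page is to fetch the
page's HTML and filter the links with grep instead.  A minimal sketch, in
shell rather than Perl; the download page URL and the .tar.gz pattern are
placeholder assumptions, not anything from a real site:

```shell
# extract_links pulls hrefs matching a filename pattern out of HTML on stdin.
# The .tar.gz pattern is an assumption; adjust it to the files you want.
extract_links() {
  grep -oE 'href="[^"]*\.tar\.gz"' | sed 's/^href="//; s/"$//'
}

# Usage sketch (placeholder URL, not run here): fetch the page, extract the
# matching links, and hand them back to wget for download.
# url="https://example.com/downloads/"
# wget -qO- "$url" | extract_links | wget --base="$url" -i - -nc
```

The --base option resolves relative links from the page against the page's
own URL, and -nc skips files already downloaded.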
If a file exists on a web server, you can check for it using wget with the
--spider option followed by the URL.
Since a successful check returns a 0 exit status, you can follow it with &&
and a `wget -bc url` to download the file only if it exists.
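The check-then-download step can be sketched as a small shell function; the
URL in the commented-out usage line is a placeholder:

```shell
# fetch_if_exists: download a file in the background only if the server
# reports that it exists.  --spider checks without downloading; -b runs
# wget in the background (logging to wget-log), -c resumes partial files.
fetch_if_exists() {
  wget --spider -q "$1" && wget -bc "$1"
}

# Usage (placeholder URL, not run here):
# fetch_if_exists "https://example.com/files/archive-2.0.tar.gz"
```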
Once the file is downloading in the background, running

wc -l wget-log && grep -i saved wget-log && rm wget-log

every so often shows the growing size of wget-log; when the download
finishes, it shows the saved file name and then removes wget-log.  The magic
is in the && operator, which only runs the next command if the previous one
succeeded.
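The "run it every so often" part can be automated with a small polling loop.
A sketch, assuming wget's default background log name wget-log and an
arbitrary 5-second interval:

```shell
# Poll wget-log until the background download finishes.  wget -b writes its
# progress to wget-log by default; the word "saved" appears when it is done.
while [ -f wget-log ]; do
  wc -l wget-log                          # log line count grows while downloading
  grep -i saved wget-log && rm wget-log   # on completion, show the name and clean up
  sleep 5
done
```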

_______________________________________________
Blinux-list mailing list
Blinux-list@xxxxxxxxxx
https://listman.redhat.com/mailman/listinfo/blinux-list



