Re: web page problem

Thanks, that works.  That shortens my script considerably.  Now I need
to arrange to spider the website for the sha512 file, and if that's not
available, exit the script.  That should be a quick operation.  The large
file takes time to download, and my original script runs the integrity
check on the file once it has finished.
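Something like this might do for the existence check; a minimal sketch,
assuming a hypothetical URL for the checksum file:

#!/bin/bash
# Hypothetical checksum URL; adjust to wherever the .sha512 actually lives.
sumurl="https://nashcentral.duckdns.org/projects/Jenux-dual.iso.sha512"

# --spider checks that the file exists without downloading it; wget exits
# nonzero on a 404, so the script can bail out early if it's missing.
if ! wget --spider --no-check-certificate "$sumurl" 2>/dev/null; then
    echo "sha512 file not available; exiting." >&2
    exit 1
fi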
I used the -b and -c options on wget to make a log file, plus some code to
report on download progress in a more civilized fashion than wget does, and
to remove the log file and exit the loop once the download finished:
# Poll the log every 30 seconds while it exists.  Report its length, and
# once wget logs "saved" the download is done, so remove the log, which
# also ends the loop.
while [ -f wget-log ]; do
    sleep 30
    wc -l wget-log && grep -i saved wget-log && rm wget-log
done
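The loop assumes wget was already started in the background, something
like this (the dated ISO name here is only a stand-in for the real one):

# -b backgrounds wget and writes its output to ./wget-log, which the
# polling loop above watches; -c resumes a partial download if one exists.
wget -b -c --no-check-certificate \
    "https://nashcentral.duckdns.org/projects/Jenux-2022.01.23-dual.iso"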
Maybe I can get wget --spider to put the URLs in jenux.inp, then download
those, run basename on the sha512 file, and pass that to sha512sum -c for
an integrity check once the download is complete.
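Roughly, that idea might look like the sketch below; the grep pattern for
pulling URLs out of the spider log and the .sha512 filename test are both
assumptions:

# Collect candidate URLs from a spider run into jenux.inp.
wget --spider --recursive --no-check-certificate \
    https://nashcentral.duckdns.org/projects/ 2>&1 |
    grep -o 'https://[^ ]*' | sort -u > jenux.inp

# Download everything listed, then check the ISO against the checksum
# file; sha512sum -c expects the ISO it names to be in the current
# directory, so --no-directories keeps the layout flat.
wget --no-directories --no-check-certificate -i jenux.inp
sumfile=$(basename "$(grep '\.sha512$' jenux.inp)")
sha512sum -c "$sumfile"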
This is more interesting than I expected it to be before trying it again.


On Sun, 23 Jan 2022, Linux for blind general discussion wrote:

> Hi
>
> Try something like
>
> wget --recursive --no-check-certificate -A 'Jenux-????.??.??-dual.iso'
> https://nashcentral.duckdns.org/projects/
>
> on one line. This will download only the .iso file but replicates the
> directory structure, including the hostname. Add '--no-directories' to
> download the files into the current directory.
>
>
> --no-check-certificate was included because the site has an expired
> certificate.




