Re: command line fan fiction program?

I believe I will try the uget idea first. It seems to require only one line of commands. As I do not run Linux directly, the simplest and most efficient path works best for me. I got lost in all your lines of commands there, *laughs*.


On Thu, 23 Mar 2017, Tim Chase wrote:

On March 23, 2017, Karen Lewellen wrote:
An example of a mass downloader included with a Linux shell?
I want to test this, but am unsure of what tool to use.

If, as Jeffrey suggests, there's a sensible pattern to the chapter
breakdowns (an actual sample URL would help), you can either use
"curl", which knows how to expand numeric ranges, or wrap the
request in a "for" loop in the shell (a sketch of the loop version
follows the curl example below).  Additionally, a little testing
suggests that the trailing story-title slug is optional.  So you can
do:

 $ STORY_ID=8045114
 $ TOTAL_CHAPTERS=87
 $ SLUG=MauradersPlan
 $ curl "https://m.fanfiction.net/s/${STORY_ID}/[1-${TOTAL_CHAPTERS}]/" \
     -o "${SLUG}_#1.html"

I created variables to clarify what's going where and what's easy to
change.  It could easily be put in a script such as "fanfiction.sh":

 #!/bin/sh
 # usage: fanfiction.sh STORY_ID TOTAL_CHAPTERS SLUG
 curl "https://m.fanfiction.net/s/$1/[1-$2]/" -o "$3_#1.html"

make it executable:

 $ chmod ugo+x ./fanfiction.sh

and then you can invoke it with

 $ ./fanfiction.sh 8045114 87 MauradersPlan


Though now that I better understand the problem, wget might be an
even better solution since it can pull down all the chapters *and*
update the internal links so that they link to each other.  And all
you need is the story ID:

 wget -c --no-parent --mirror --trust-server-names --convert-links \
     https://m.fanfiction.net/s/8045114/

Again, as per my previous message, you might want to use the

 --limit-rate=20.5k
 --random-wait

options as well to be a little kinder to the server instead of
hammering it.
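
Putting those together with the mirror command above would look
something like this (the 20.5k rate is just the example figure from
my previous message; adjust to taste):

 wget -c --no-parent --mirror --trust-server-names --convert-links \
     --limit-rate=20.5k --random-wait \
     https://m.fanfiction.net/s/8045114/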

You might then want to create a symlink so that a sensible name
points at the directory wget created, letting you remember it as
"MauradersPlan" instead of "8045114":

 ln -s m.fanfiction.net/s/8045114 MauradersPlan
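
With that symlink in place, you can browse the mirror by its
friendly name, for example:

 $ lynx MauradersPlan/

(the exact file names inside depend on what wget saved, but lynx
will show you a directory listing to pick from).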

The wget and curl utilities are pretty prevalent, so I imagine
they're already available to you (and if not, they should be
uneventful to install).

I did the above myself, which gave me an 87-chapter book that I
could navigate offline with lynx (or any other browser) without
encountering any major issues.

Best wishes,

-tim

_______________________________________________
Blinux-list mailing list
Blinux-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/blinux-list