
Re: Ahead Caching


 



Well, I'm starting to think that Amos is right when he says
pre-caching isn't so useful.

If you're saying you should go through the pages listed in the logs
and run "wget -p" on them, well, that's useless, because those items
were already cached. That would be more like "post-caching".

Also, wget's recursion can be limited. You'll notice that both
Flaviane and I used the '-l 1' switch. We both thought it's not
worth going more than one page down from any already cached page.
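For reference, the sort of invocation we were using looked roughly
like this (a sketch, not our exact command; it assumes squid is
listening on localhost:3128, and http://www.example.com/ stands in
for a real site from the logs):

  # recurse one level, grab page requisites, go through the proxy
  http_proxy=http://127.0.0.1:3128/ wget -r -l 1 -p --delete-after http://www.example.com/

The '--delete-after' is just so the fetched files don't pile up on
disk; the whole point is only to pull them through the cache.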

On Wed, Oct 6, 2010 at 12:27 PM, John Doe <jdmls@xxxxxxxxx> wrote:
>
> From: Isaac Witmer <isaaclw@xxxxxxxxx>
>
> > On Tue, Oct 5, 2010 at 11:51 AM, John Doe <jdmls@xxxxxxxxx> wrote:
> > > Why recurse?
> > > If you take your list from the log files, you will get all accessed files
> > > already... no?
> > How would you do it?
> > With wget, the only way of having it crawl through websites is to
> > recurse... isn't it?
>
> Why cache the possibly hundreds of pages below a URL if nobody goes to see
> them...?
> If people only visit sites' front pages, you just need to pre-cache those
> front pages.
> If they go see some sub-pages, those will appear in the logs...
> So I think you only need the '-p' option.
>
> JD
>
>
>
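For clarity, JD's log-driven approach would be something like this
(an untested sketch; it assumes squid's default native log format,
where the request URL is the seventh field, and squid on
localhost:3128):

  # pull the URLs people actually requested out of the access log
  awk '{print $7}' /var/log/squid/access.log | sort -u > urls.txt
  # refetch each one plus its page requisites, through the cache
  http_proxy=http://127.0.0.1:3128/ wget -p --delete-after -i urls.txt

No recursion at all: only pages someone actually visited (plus their
images and CSS) get pulled through the cache.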


