Re: caching websites automatically

lorenor wrote:
Hello,

I'm searching for a method to cache websites automatically with squid.
The goal is to give squid a list of URLs and have the proxy cache those
sites.
I know only one way to cache a site: a client has to make a request.
But is there another way, without client interaction?

    No, squid has no mode to do that automagically.

But with some Linux client tools, wget for example, you can easily do that!

cd /tmp/garbage

sites.txt should contain the URLs of the sites you want to fetch:

www.onesite.com
www.othersite.com
www.anything.com



export http_proxy=http://your.squid.ip.box:3128
wget -i sites.txt --mirror
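
To wrap those steps in a single reusable script, something along these lines should work (a minimal sketch; the directory /tmp/garbage and the proxy address your.squid.ip.box:3128 are the placeholders used above, and the script name prime-cache.sh is just invented for illustration):

#!/bin/sh
# prime-cache.sh -- fetch every site listed in sites.txt through the
# squid proxy, so squid ends up storing whatever is cacheable.
# Work directory and proxy address are placeholders, adjust them.

WORKDIR=/tmp/garbage

# make all of wget's requests go through the squid box
export http_proxy=http://your.squid.ip.box:3128

mkdir -p "$WORKDIR"
cd "$WORKDIR" || exit 1

# --mirror = recursive fetch with time-stamping, so later runs only
# re-download what changed on the remote sites
wget --mirror -i sites.txt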

That should fetch everything from the listed sites in mirror style and save it under the directory where you started wget. Depending on the amount of data, this could take a long time to run. You can probably erase everything after wget finishes, but it may be smarter to keep the files and run the mirror again a few days later, which will generate much less traffic.
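
Two side notes on that. If you do not want to keep the local copies at all, wget also has a --delete-after option that removes each file right after it has been downloaded, though you then lose the timestamp comparison that makes later runs cheaper. And if you want the re-run to happen automatically "some days after", a crontab entry is the usual way; a sketch, assuming the script above was saved as /usr/local/bin/prime-cache.sh:

# re-prime the cache every Sunday at 03:00
0 3 * * 0    /usr/local/bin/prime-cache.sh >/dev/null 2>&1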

At the end of the process, squid should have cached everything that is cacheable according to the sites' own configurations and your caching parameters as well.
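
If you want to confirm that repeated requests are really being answered from the cache, you can watch for TCP_HIT entries in squid's access.log, for example (the log path below is a common default; yours may differ):

grep TCP_HIT /var/log/squid/access.log | tail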

Squid has no automatic mode for doing this, but it can easily be done with wget.


--

	Sincerely,
	Leonardo Rodrigues
	Solutti Tecnologia
	http://www.solutti.com.br

	My SPAM trap, do NOT email it
	gertrudes@xxxxxxxxxxxxxx





