Re: Squid questions

Kishore Venkat wrote:
Hello everyone,

I have set up Squid 3.0.STABLE9 for testing purposes and I have the
following questions:

1.  We are trying to take steps to prevent DoS attacks and are
considering using Squid to cache pages and reduce the load on the
origin servers.  The particular scenario we are targeting is people
posting URLs containing member-specific information in the query
string (such as an email address, member ID, or coupon code) on
social networks to take advantage of promotional offerings (such as
coupons), at which point we suddenly get a burst of traffic to our
site - these would be either .jsp or .asp URLs.  I have tried using
the following line in our squid config:

refresh_pattern -i \.asp$ 10080 90% 999999 ignore-no-cache override-expire ignore-private


and from my testing it appears to cache them only if there is no "?"
in the URL (even if you do NOT pass any URL parameters, but have the
"?" in the URL, it still does not cache them - even if the .asp
contains only HTML code).  From my understanding, there is no way to
cache .asp / .jsp pages with query-string parameters - could you
please confirm this?

No. The old config files (3.0 included) have the following:

  acl QUERY ...
  cache deny QUERY

To cache specific URLs, you define an ACL and place a "cache allow ..."
line before the deny line.
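
For example, something like this would let one promotional page be
cached while everything else with a query string keeps being denied
(the "promo_pages" name and path pattern are made up for illustration):

  # allow must come before the deny; the first matching
  # "cache" line wins
  acl promo_pages urlpath_regex -i ^/promo/offer\.asp
  cache allow promo_pages
  cache deny QUERY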


I was wondering if there is a way to cache these dynamic .asp pages?
We do not want all the .asp pages to go through the squid cache, as a
lot of them are dependent on data in the database, and if the values
in the db change, the content served must change as well.  So, we
could place the pages that need to go through Squid's cache in a
folder called "squid", and modify the above squid.conf line so that
only those .asp pages present in the "squid" folder go through the
squid cache.

No need to be so tricky. The application can set Cache-Control differently for each page, and that can also limit how long items stay in the cache, down to 0 seconds.
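
For instance, the dynamic pages themselves could send response headers
along these lines (the values are purely illustrative):

  Cache-Control: public, max-age=600   (cacheable for 10 minutes)
  Cache-Control: max-age=0             (stored, but revalidated on every request)
  Cache-Control: no-store              (never stored at all)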


If there are other ways of preventing DoS attacks for the
above-mentioned scenario, please let me know.

All I can think of right now is:

* Stay away from regex as much as possible; it's slow.

* Configure the cache_peer link with a raw IP and either a dstdomain ACL or cache_peer_domain, cutting DNS lookups out of the circuit (see the sketch after this list).

* Extend object timeouts as long as reasonable.

* Use the ignore-reload option to refresh_pattern, and maybe others.
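
A minimal sketch of the cache_peer idea, assuming Squid sits in front
of a single origin server at 192.0.2.10 serving www.example.com (the
IP, domain, and "origin1" name are placeholders):

  # talk to the origin by raw IP so no DNS lookup is needed,
  # and only send it requests for our own domain
  cache_peer 192.0.2.10 parent 80 0 no-query originserver name=origin1
  acl our_site dstdomain www.example.com
  cache_peer_access origin1 allow our_site
  cache_peer_access origin1 deny all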


2.  The one concern that I have is the Squid server itself being prone
to denial of service due to sudden bursts in traffic.  Can someone
share their experience from implementations on their own web sites?

Nothing can be completely DDoS-secure, but Squid has a much higher requests-per-second capability than most generated pages allow a webserver to have. So it's a good layer for raising the DDoS damage threshold.


3.  When using squidclient for testing purposes, if I have a very
long URL (something that is 205 characters long, for example), it
appears that the request to the origin servers does NOT contain the
entire URL (with all the parameters).  The squidclient command
(including the -h and -p options and the 205-character URL) is 261
characters long.  I saw a bug to do with the length of the hostname,
but I believe that was in an earlier version of Squid and NOT in
squid 3 stable 9.  Is there a way to test really long URLs?

telnet, wget, or any other web client software should also work.

squidclient should have an 8KB limit on the URL, as for any header, though.
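
For example, a long URL can be pushed through the proxy with wget
instead (assuming Squid listens on 127.0.0.1:3128; adjust host and
port to your setup):

  http_proxy=http://127.0.0.1:3128/ wget -S -O /dev/null \
    'http://www.example.com/page.asp?memberid=12345&coupon=ABC...'

The "..." just stands in for the rest of the long query string.
Quoting the URL stops the shell from interpreting the "&" characters,
and -S prints the response headers so you can see what came back.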


4.  If the disk space that we allocate is completely used, do we know
what algorithm Squid uses to make room for new pages - such as LRU?
And is this configurable?

Yes. The default is LRU, and it is configurable:
http://www.squid-cache.org/Doc/config/cache_replacement_policy/
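
For example, to switch the disk cache to the heap-based LFUDA policy
instead of the default LRU (one of the documented alternatives; the
line should come before your cache_dir entries so it applies to them):

  # LFUDA keeps frequently hit objects, largely regardless of size
  cache_replacement_policy heap LFUDA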

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE5 or 3.0.STABLE10
  Current Beta Squid 3.1.0.3 or 3.0.STABLE11-RC1
