
Re: Performance Squid


 



Quoting Washington Correia - Terra <washcpc@xxxxxxxxxxxx>:

Hi! I'd like to know if somebody can suggest a link or tutorial about Squid performance. I want to research/study my proxy's performance and need some help and information to get started. I frequently receive feedback that the Internet is slower, and these users are going through the proxy. Could anybody help me?

Thanks!



This is just my opinion, but you should divide your research into two categories of wait time.

There is the wait time between when a user types www.domainname.com into their browser (or clicks a link) and when the proxy decides whether the page will be served from the cache, denied by ACL rules, or forwarded up the chain.

There is also the wait time between when Squid passes the request upstream and when the request is fulfilled.

The only one you can fix with Squid is the first, and there has been some discussion on this list that Squid's performance seems to drop off when its ACLs are checked against long lists of websites for deny purposes.
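For reference, the kind of list-based deny rule being discussed looks something like this in squid.conf (the ACL name and file path here are made up for illustration; every line in the file is matched against each request, which is why very long lists can hurt):

```
# Load a (potentially very long) list of domains from a file,
# one domain per line; a leading dot matches subdomains too.
acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"

# Deny any request whose destination matches the list.
http_access deny blocked_sites
```

The longer blocked_domains.txt grows, the more work Squid does per request before it can serve, deny, or forward anything.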

(Remember that Squid's performance depends on the equipment you are running it on, so "faster" or "slower than before" is relative.)

If you have linked two or three proxy servers together (I'm working on this myself), you may be able to improve your total proxy time by moving jobs between servers that are better suited to each task.

For example:

If you have a subscription to urlblacklist.com, the lists are quite long and quite extensive. While you can use Squid to block against these lists, DansGuardian (for me at least) works much faster with long lists, such as blocking porn inside a corporate network.

DansGuardian is not capable of using time of day in its filter rules, though, so one setup might be to use Squid to deny sports sites like espn.com during work hours and to deny porn all the time.
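The time-based half of that split can be expressed with Squid's `time` ACL type. A minimal sketch (the ACL names and the porn list path are my own placeholders):

```
# Monday-Friday, 09:00 to 17:00 (day letters: S M T W H F A)
acl work_hours time MTWHF 09:00-17:00

# Sports sites to block only during work hours
acl sports dstdomain .espn.com

# Porn list to block at all times (one domain per line in the file)
acl porn dstdomain "/etc/squid/porn_domains.txt"

http_access deny sports work_hours
http_access deny porn
```

Listing two ACLs on one `http_access` line ANDs them together, so the sports rule only fires during work hours, while the porn rule applies around the clock.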

In my view, the ideal way to do this is DansGuardian handing requests to Squid, and Squid then going out to the Internet (or to the cache).
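Chaining them that way is done in dansguardian.conf: browsers point at DansGuardian, and DansGuardian forwards filtered requests to Squid as its parent proxy. A rough sketch, assuming both run on the same host and Squid listens on its default port 3128:

```
# dansguardian.conf (excerpt)
# Port that client browsers connect to
filterport = 8080

# Parent proxy (Squid) that filtered requests are passed to
proxyip = 127.0.0.1
proxyport = 3128
```

So the path becomes: browser -> DansGuardian (content filtering) -> Squid (ACLs, caching) -> Internet.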

You might also consider disabling the cache for all URLs and clients. It's a personal thing really, but when you can configure Apache to parse HTML documents as PHP scripts, any web page can have dynamic content, and you can't really tell just by looking at the document's filename extension.

For my use, I have decided to disable the cache for all clients. I don't want to maintain copies of what's already on the Internet; I'm just trying to establish reasonable access limits for my corporate network.
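Disabling caching entirely is a short squid.conf change; the exact directive name has varied between Squid versions, so check yours:

```
# Squid 2.x era: "no_cache"; later versions renamed it to "cache".
# Either form denies caching for every request, turning Squid into
# a pure forwarding/filtering proxy.
acl all src 0.0.0.0/0.0.0.0
no_cache deny all
```

With this in place Squid still applies all of its ACLs; it simply never stores or serves objects from disk.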

It's still a work in progress, and I'm not fully operational yet; I just figured I'd give you a heads-up on what I've seen so far.

Good luck


