
Re: Squid Memory Leak with certain FTP requests?


On 02/11/2015 11:10 AM, Yuri Voinov wrote:
> Squid first stores the object in memory, then swaps it out to the cache. This
> is not a memory leak but normal cache behaviour, as documented.
> 
> You can play around with the range_offset_limit and quick_abort_min parameters.
> 
> Or try not caching this FTP traffic with an ACL.
> 
> Usually, when a memory leak is suspected, it turns out to be an OS issue, not
> a Squid one.

Hello Yuri,

Thanks for your quick reply.
The ACL you suggested will probably solve the problem.
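For reference, here is a minimal squid.conf sketch of how I understand that suggestion (the acl name "ftp" is just my label; the directives are the standard ones):

    # do not cache objects fetched over FTP
    acl ftp proto FTP
    cache deny ftp

If I read the documentation right, that should keep these FTP objects out of the cache entirely.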

Nevertheless, I'm not sure this is working as intended, since the memory used
by Squid keeps increasing and does not settle after a while.
How is range_offset_limit relevant to FTP requests?
quick_abort_min does not make sense to me either, as there are no aborted
requests in my test case.
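For context, as far as I can tell from the squid.conf documentation these two directives default to something like the following, which is why I don't see how either would apply here:

    # defaults, as I understand them from the documentation
    range_offset_limit 0      # limits readahead beyond HTTP Range requests
    quick_abort_min 16 KB     # governs whether client-aborted transfers are completed

Please correct me if I've misread how they interact with FTP.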

It seems rather that Squid generates the index.html when I do a wget
ftp://foo.bar/pub/ and then does not free the cached object later on.

Just for the record, the memory footprint of the process serving the FTP
requests is ten times higher than that of the Squid processes handling normal
web traffic at a high request rate.

-- Matthias
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users




