Re: Ideas for better caching these popular urls

Hey Omid,

I found the service I wrote and packaged it as an RPM at:
http://ngtech.co.il/repo/centos/7/x86_64/response-dumper-icap-1.0.0-1.el7.centos.x86_64.rpm

If you are using another OS, let me know and I will try to package it for it.
Currently, on Debian/Ubuntu, alien converts the RPM smoothly.

The dumps directory is at:
/var/response-dumper

But the cleanup and the filtering ACLs are your job.
You can define which GET requests the service dumps/logs to files.
Each file in this directory is named in the following format:
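For the filtering side, a minimal squid.conf sketch might look like the following. The service name, port, ICAP path, and the example ACL are all assumptions; the actual values depend on how the packaged service is configured:

```
# Hypothetical squid.conf fragment: send only selected GET requests
# to the response-dumper ICAP service (names, port, and ACL are assumptions).
icap_enable on
icap_service response_dumper respmod_precache icap://127.0.0.1:1344/dumper
acl dump_these url_regex -i ^http://example\.com/
adaptation_access response_dumper allow dump_these
adaptation_access response_dumper deny all
```

For the cleanup side, a cron job such as `find /var/response-dumper -type f -mtime +7 -delete` keeps the dump directory bounded.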
<int epoc time>-<8 bytes uuid>-<md5(GET:full url)>

This format allows multiple requests for the same URL to happen at the same time, each getting a distinct file name, while the URL hash part stays the same, so you can filter files by it.
To calculate the hash of a URL use:
$ echo -n "GET:http://url-to-hash.com/path?query=terms" | md5sum
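The same hash, and a lookup of matching dump files, can be sketched in Python like this (the `dumps_for` helper and its glob pattern are my own illustration, inferred from the file-name format described above):

```python
import glob
import hashlib

def url_hash(url: str) -> str:
    """MD5 of "GET:<url>", matching the hash embedded in dump file names."""
    return hashlib.md5(f"GET:{url}".encode()).hexdigest()

def dumps_for(url: str, dump_dir: str = "/var/response-dumper") -> list[str]:
    """List dump files for one URL, assuming the
    <epoch>-<uuid>-<md5> name format described in the post."""
    return sorted(glob.glob(f"{dump_dir}/*-*-{url_hash(url)}"))
```

Because the epoch time and UUID differ per request, sorting the matches gives you the dumps for one URL in rough chronological order.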

Each file contains the full ICAP respmod details, i.e.:
ICAP Request\r\n
HTTP Request \r\n
HTTP Response\r\n
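A dump with that layout could be split into its three sections roughly like this. This is a sketch only: it assumes the blocks are separated by blank lines (\r\n\r\n), which the actual on-disk format may or may not follow exactly:

```python
def split_dump(raw: bytes) -> tuple[bytes, bytes, bytes]:
    """Split a dump into (icap_request, http_request, http_response).

    Assumes the three blocks are separated by CRLF CRLF blank lines;
    everything after the second separator is treated as the HTTP
    response (headers plus any body).
    """
    parts = raw.split(b"\r\n\r\n")
    icap_req = parts[0]
    http_req = parts[1]
    http_resp = b"\r\n\r\n".join(parts[2:])
    return icap_req, http_req, http_resp
```

Keeping the response as one chunk avoids corrupting bodies that themselves contain blank lines.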

By default, Cookie and Authorization headers are censored in both the request and the response in the dump, to avoid privacy-law issues.

Now the only missing feature is for RedBot to accept a single request and a single response as input and produce a full analysis.

Let me know if it works OK for you (it has been working fine here for a while now).

Eliezer

----
Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: eliezer@xxxxxxxxxxxx


-----Original Message-----
From: squid-users <squid-users-bounces@xxxxxxxxxxxxxxxxxxxxx> On Behalf Of Omid Kosari
Sent: Wednesday, April 11, 2018 12:32
To: squid-users@xxxxxxxxxxxxxxxxxxxxx
Subject: Re:  Ideas for better caching these popular urls

Eliezer Croitoru wrote
> You will need more than just the urls but also the response headers for
> these.
> I might be able to write an ICAP service that will log requests and
> response headers and it can assist Cache admins to improve their
> efficiency but this can take a while.

Hi Eliezer,

Nice idea. I am ready to test/help/share whatever you need in a real
production environment. Please also make it general so it covers the other
domains in the first post's attachment. They are worth a try.

Thanks




--
Sent from: http://squid-web-proxy-cache.1019090.n4.nabble.com/Squid-Users-f1019091.html
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users
