Re: decreased requests per second with big file size

Hi Amos,


Got it. Will go through the session helpers & figure out how to do it.

Thanks for the help :)


Ambadas

On Tue, Oct 13, 2015 at 1:25 PM, Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:
On 12/10/2015 6:51 p.m., Ambadas H wrote:
> Hi Amos,
>
> Thanks for responding
>
> *"You would be better off taking the first use of any domain by a client,*
>
> *then ignoring other requests for it until there is some long period*
> *between two of them. The opposite of what session helpers do."*
>
> Could you please elaborate a little on the above logic?

That is about as clear as I can explain it, sorry. Look at what the
session helpers do to determine whether two requests are part of the
same session or not.

You need to start with that, *then* figure out how to split each sequence
of requests now grouped into "session" down into whatever grouping you
define "page" to be.


>
> My understanding, if I'm not wrong, is to take the domain/host of the
> first client GET request & not count subsequent GET requests whose
> domain/host matches it.
>
> In this case there is a possibility of multiple unique domains/hosts
> for a single page (e.g. ads, analytics, etc. served from other
> domains)?

Yes. There is simply no concept of "page" in HTTP.

It is a hard problem to even figure out with any accuracy what requests
are coming from the same client.

Amos
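
For completeness, a sketch of how such a helper could be wired into
squid.conf as an external ACL (the helper path, ACL name and log file
are made up for illustration; ttl=0 stops Squid caching the helper's
answers, which would otherwise defeat the timing logic):

    external_acl_type page_start ttl=0 negative_ttl=0 %SRC %DST \
        /usr/local/bin/page_start_helper.py
    acl new_page external page_start
    # log only the requests the helper judged to start a new "page"
    access_log /var/log/squid/pages.log squid new_page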


_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users
