
Re: Cache settings per User Agent?


> Hello,
>
> On Tue, Oct 7, 2008 at 8:11 PM, Henrik Nordstrom
> <henrik@xxxxxxxxxxxxxxxxxxx> wrote:
>> Best done by the origin server using the Vary header and Cache-Control:
>> max-age..
>>
>
> It can't, since that would make my squid cache the page for normal
> users too. It should not be cached for normal requests. Robots don't
> need the most up-to-date results, don't need personalized content, etc.

Robots DO need the latest version of your page. How can people find your
new content if it's not indexed on search engines?

True about personalized content, but that's done by the web server,
right? So it can send a generic page with the correct Vary, ETag, and
Cache-Control headers to robots and all other unknown agents.
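For example, a generic response aimed at robots and other unknown agents
might carry headers along these lines (the values here are illustrative,
not a recommendation):

```
HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
ETag: "generic-v1"
Cache-Control: public, max-age=3600
```

Bear in mind that Vary: User-Agent splits the cache by the exact UA
string, which can badly fragment it given how many distinct UA strings
exist in the wild.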

>
> I only want to cache if, and only if, the UA is a robot. Squid will
> answer the request from cache and the robots will not hit my backend.
>

Any idea how many robots there are on the web? I've found it FAR better
for bandwidth and processing to have a generic default version of a page
that unknown agents get, personalizing only for known agents that can be
personalized accurately.
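For what it's worth, the per-robot caching asked about above can be
roughly approximated in squid.conf with a "browser" ACL (which matches
the User-Agent header). The regex list here is a small illustrative
sample, nowhere near a complete robot list:

```
# Match a few common crawler User-Agents (case-insensitive, illustrative only)
acl robots browser -i Googlebot Slurp msnbot
# Only allow responses to be stored when the request matched the robots ACL
cache deny !robots
```

This only controls what squid stores; it does nothing about the
correctness problems of serving the same URL differently per agent,
which is the point above.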

Amos


