RE: sometimes the users can't visit any webpage

On Wed, 2 Sep 2009 16:33:49 -0500, "Jesus Angeles"
<jangeles@xxxxxxxxxxxxxxxxxxxxxxx> wrote:
> Hi, thanks for your interest.
> 
> Well, today I had the same problem; this is an extract from my cache.log.
> The problem happened at about 15:30 hrs, the user reported it to me at
> about 16:00 hrs, and I had to restart the squid service.
> 
> Any idea? What does "httpReadReply: Excess data from..." mean?

The server at www.paginasamarillas.com.pe is pushing more data into Squid
after the objects it is supposed to be sending have finished. That is either
a broken web server or a malicious attack. Not good either way.
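
A quick way to confirm this from the proxy host is to compare what the
origin's headers advertise against what it actually sends. A rough sketch,
assuming curl and nc are available (the URL is one of those from the log
below); subtract the header bytes by hand, and if the body exceeds
Content-Length the origin is at fault:

  # what the server claims it will send
  curl -s -D - -o /dev/null http://www.paginasamarillas.com.pe/js/scriptHome.js.jsp | grep -i content-length

  # everything the server actually sends, headers included, until it closes
  printf 'GET /js/scriptHome.js.jsp HTTP/1.1\r\nHost: www.paginasamarillas.com.pe\r\nConnection: close\r\n\r\n' \
    | nc www.paginasamarillas.com.pe 80 | wc -c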

> 
> 2009/09/02 06:32:19| storeDirWriteCleanLogs: Starting...
> 2009/09/02 06:32:19|     65536 entries written so far.
> 2009/09/02 06:32:19|    131072 entries written so far.
> 2009/09/02 06:32:19|   Finished.  Wrote 132412 entries.
> 2009/09/02 06:32:19|   Took 0.0 seconds (4109238.7 entries/sec).
> 2009/09/02 06:32:19| logfileRotate: /var/log/squid/store.log
> 2009/09/02 06:32:19| logfileRotate (stdio): /var/log/squid/store.log
> 2009/09/02 06:32:19| logfileRotate: /var/log/squid/access.log
> 2009/09/02 06:32:19| logfileRotate (stdio): /var/log/squid/access.log
> 2009/09/02 06:32:19| logfileRotate: /var/log/squid/access1.log
> 2009/09/02 06:32:19| logfileRotate (stdio): /var/log/squid/access1.log

> 2009/09/02 15:32:49| httpReadReply: Excess data from "GET http://www.paginasamarillas.com.pe/js/scriptTagHead.js.jsp"
> 2009/09/02 15:32:49| httpReadReply: Excess data from "GET http://www.paginasamarillas.com.pe/js/scriptHome.js.jsp"
> 2009/09/02 15:32:49| httpReadReply: Excess data from "GET http://www.paginasamarillas.com.pe/searchBarLocality.do?stateId=&cityId=&suburbId="

The server at www.paginasamarillas.com.pe told Squid it was sending objects
of size X, then pushed X + N bytes of data down the link. A response-splitting
attack is probably underway. Squid drops those connections.
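
If it keeps biting your users, one possible stopgap is to refuse proxying
to that origin until it is fixed. A sketch for squid.conf (the ACL name
broken_origin is just an example; the deny line must sit above your general
allow rules):

  acl broken_origin dstdomain www.paginasamarillas.com.pe
  http_access deny broken_origin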

> 2009/09/02 15:59:54| Preparing for shutdown after 337998 requests
> 2009/09/02 15:59:54| Waiting 30 seconds for active connections to finish

Someone shut down Squid.
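
The 30-second wait is Squid's shutdown_lifetime grace period, which is
tunable in squid.conf (30 seconds is the default):

  # how long a shutting-down Squid waits for active connections to finish
  shutdown_lifetime 30 seconds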

> 2009/09/02 15:59:54| FD 11 Closing HTTP connection
> 2009/09/02 16:00:25| Shutting down...
> 2009/09/02 16:00:25| FD 12 Closing ICP connection
> 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.1 connection due to lifetime timeout
> 2009/09/02 16:00:25| 	http://mail.google.com/mail/images/cleardot.gif?zx=g31q8sija2fo
> 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.136 connection due to lifetime timeout
> 2009/09/02 16:00:25| 	http://kh.google.com/geauth
> 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.1 connection due to lifetime timeout
> 2009/09/02 16:00:25| 	http://toolbarqueries.clients.google.com/history/feeds/default/subscriptions/browser
> 2009/09/02 16:00:25| WARNING: Closing client 172.20.100.1 connection due to lifetime timeout
> 2009/09/02 16:00:25| 	http://mail.google.com/mail/images/cleardot.gif?zx=8w46jyqzoqzz

Two clients had their four active connections closed on them when the grace period ran out.
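
If you want to count how many clients get cut off this way, something like
the one-liner below should work (the cache.log path is an assumption based
on the other log paths above; $6 is the client IP field in this log format):

  grep 'lifetime timeout' /var/log/squid/cache.log | awk '{print $6}' | sort | uniq -c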

<snip>
> 2009/09/02 16:00:25| Squid Cache (Version 2.7.STABLE3): Exiting normally.
> 2009/09/02 16:00:26| Starting Squid Cache version 2.7.STABLE3 for i386-debian-linux-gnu...
<snip>
> 
> 
> -----Original Message-----
> From: Jeff Pang [mailto:pangj@xxxxxxxx]
> Sent: Monday, August 31, 2009, 08:44 p.m.
> To: squid-users
> Subject: Re: sometimes the users can't visit any webpage
> 
> 2009/9/1 Jesus Angeles <jangeles@xxxxxxxxxxxxxxxxxxxxxxx>:
>> Hi all, I have a problem. Three weeks ago I installed Squid 2.7.STABLE3 +
>> Dansguardian 2.10.1.1 on GNU/Linux Ubuntu Server 9.04. The first week was
>> ok, but then the service started to fail: sometimes (once or twice per
>> day) the users can't visit any webpage, and the web browser shows a blank
>> page (delay on load). In those moments I check:
>> -       The squid service is running.
>> -       Dansguardian is ok, because if the users try to visit a prohibited
>> web page, it shows the access denied page.
>> -       The logfile (access.log) is generating logs (I checked with
>> tail -f).
>> -       The memory and HD space are ok (I have configured 256 MB in
>> cache_mem and 4096 MB in cache_dir).
>> Then, in those moments, I have to execute "/etc/init.d/squid reload" to
>> solve the problem.
>>
> 
> Have you checked cache.log for any unusual requests?
> Only the info in cache.log (possibly with a raised debug level) is useful.
> 
> Jeff.
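
To get more detail into cache.log without restarting, a minimal sketch:
raise debug_options in squid.conf and make Squid re-read its configuration.
ALL,2 is much noisier than the default ALL,1, so revert it once the failure
has been captured:

  # in squid.conf: raise the global debug verbosity
  debug_options ALL,2

  # apply the change without dropping client connections
  squid -k reconfigure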
