I have a weird problem where, if I go through Squid, a page stalls until it hits the read timeout limit. This only happens with certain web servers. For example, with read_timeout set to 120 seconds:

1. Request the page in the browser.
2. The page load stalls.
3. After 120 seconds, Squid sends a FIN, ACK to the HTTP server.
4. The HTTP server returns an HTTP 200 after receiving the FIN, ACK packet.
5. The browser gets the requested page.

Sometimes Squid loads the page fine.

Here's my Squid build:

Squid Cache: Version 2.7.STABLE9
configure options: '--enable-linux-netfilter' '--enable-follow-x-forwarded-for' '--enable-linux-tproxy' '--enable-epoll' '--enable-async-io' '--with-pthreads' '--enable-storeio=ufs,aufs,coss,diskd,null' '--enable-removal-policies=lru,heap' '--enable-snmp' '--with-maxfd=65536' 'CFLAGS=-march=pentium3 -O2 -fomit-frame-pointer -pipe' 'CPPFLAGS=-march=pentium3 -O2 -fomit-frame-pointer -pipe'

You can download the tcpdump capture and Squid config from http://mina.lolipower.org/mike/squid-stall.zip

If I access the page directly in the browser, the stalling doesn't happen.

Any ideas on how to address this?

Thanks,
mike
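
P.S. For reference, the directive in question is set roughly like this in squid.conf (the value matches what I described above; the full config is in the zip):

    # read timeout between Squid and the origin server
    read_timeout 120 seconds

And in case anyone wants to take their own capture to compare against the one in the zip, something along these lines should work on the Squid box (the interface name and web server address here are just placeholders):

    # capture full packets between Squid and the problem web server
    tcpdump -i eth0 -s 0 -w squid-stall.pcap host 203.0.113.10 and port 80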