On Fri, 2003-05-23 at 03:28, JeffParkinson@xxxxxxxxxxxxxx wrote:
> Hi All,
>
> I am running the squid caching software and it was working fine,
> but just 2 days ago we started having problems downloading webpages.
> Typically, we would get most of the content and just be missing some
> pictures, but at other times the whole page would not load.
>
> Clicking refresh multiple times would end up loading the page correctly.
> Looking into the access and cache logs doesn't show any messages which
> point us in the direction of why it won't load.
>
> We flushed the cache and the pages loaded better, but later it just
> started happening again, which leads me to think that I have a setting
> set wrong or in need of increasing. My cache size is set at 900MB,
> which should be large enough for the amount of users we have.

Is there enough space for the cache? I mean *more* than 900MB.

Can you post a calamaris -p new -P 60 -s of the last access.logs?

Also look for TCP_SWAPFAIL_MISS in the logs.

Piero

-- 
Psyche-list mailing list
Psyche-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/psyche-list
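
A rough sketch of how to check both points above (free disk space versus the configured cache size, and SWAPFAIL entries), assuming the default Red Hat layout of /var/spool/squid for the cache directory, /etc/squid/squid.conf for the config, and /var/log/squid/access.log for the log; adjust the paths to match your setup:

    # free space on the partition holding the cache directory
    df -h /var/spool/squid

    # configured cache size: the number right after the directory path
    # on the cache_dir line is the size in MB
    grep '^cache_dir' /etc/squid/squid.conf

    # count swap failures in the current access log
    grep -c TCP_SWAPFAIL_MISS /var/log/squid/access.log

TCP_SWAPFAIL_MISS entries mean squid believed an object was in the cache but could not read it back from disk, which points at a damaged or over-full cache directory rather than at the 900MB setting itself.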