Of course... I should have thought of doing a simple comparison of the ones that work against the ones that don't. The results are odd: I'm getting 403s on all those broken links. TCP_MISS/200 for the ones that work, of course, and TCP_MISS/403 for the ones that don't. Makes perfect sense, except for the fact that they're THERE, heh. This would also explain why refreshing isn't helping, if the objects are marked as 403'ed. Thanks for the assistance.

On 11/27/06, Mark Elsen <mark.elsen@xxxxxxxxx> wrote:
> before I get started, do note that bypassing the proxy, everything is fine.
>
> I've used Squid without a hitch for years now, but never over a
> satellite link. On sites with a lot of links or images (news sites,
> hardware reviews, p*rn, hell, any modern site) it will break a LOT of
> the links to pictures, background images, etc. Basically, after a week
> of trying every directive I've ever used and then some, I can't make
> it any better. Putting dns_servers 127.0.0.1 helped in a way: it made
> the sites begin to load faster, but the broken links were still there.
> I basically got nowhere faster with that one. Nothing else I've tried
> seems to faze it. I'm not sure if there are any OS tweaks that should
> be done at this point, or if I'm SOL.
>
> ANY help / anyone with experience with this would be greatly
> appreciated. We're a lowly, cold (and remote!) little Air Force base
> that needs all the help we can get in our barracks!
>
>...

Check access.log for those requests leading to "lost" images, etc.
Check cache.log for further error messages, if any.

M.
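(The comparison described at the top of the thread can be sketched roughly like this. The sample lines and the /tmp path are made up for illustration; a real access.log usually lives wherever your squid.conf's access_log directive points, e.g. /var/log/squid/access.log. In Squid's native log format, field 4 is the result-code/status pair and field 7 is the URL.)

```shell
# Hypothetical sample of Squid native access.log entries
# (fields: timestamp elapsed client code/status bytes method URL ...):
cat > /tmp/access.sample <<'EOF'
1164600000.123    95 10.0.0.5 TCP_MISS/200 4512 GET http://example.com/index.html - DIRECT/1.2.3.4 text/html
1164600001.456    80 10.0.0.5 TCP_MISS/403 1190 GET http://example.com/img/banner.gif - DIRECT/1.2.3.4 text/html
EOF

# Print the URLs that came back 403 -- i.e., the "broken" images/links:
awk '$4 == "TCP_MISS/403" {print $7}' /tmp/access.sample
```

Swap the status pattern for TCP_MISS/200 to list the working requests, and diff the two lists to see the pattern in what's being refused.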