Donoso Gabilondo, Daniel wrote:
Hello,
I have an application on Linux that uses HTTP resources (videos,
images...). These resources are on another machine running an HTTP
server (under Windows).
The Linux application always downloads the resources. I installed and
configured Squid on the Linux machine to cache these resources, but the
Linux application still downloads them from the HTTP server every time.
I don't know how to resolve the problem. I need some help, please.
I suspect you are trying to do some sort of web mashup involving Squid?
I've found the best way to do those is to have Squid as the public
domain gateway and do the app-linking/routing in the squid config.
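As a rough illustration only (Squid 2.6 accelerator syntax, with a
made-up hostname www.example.com standing in for your public site and
assuming the Windows app listens on 192.168.233.158:8080), that kind of
setup looks something like:

http_port 80 accel defaultsite=www.example.com
cache_peer 192.168.233.158 parent 8080 0 no-query originserver name=webapp
acl app_sites dstdomain www.example.com
cache_peer_access webapp allow app_sites
cache_peer_access webapp deny all
http_access allow app_sites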
Anyway on to your various problems....
The Linux IP address is 192.168.240.23 and the Windows machine with the
HTTP server is 192.168.233.158.
This is my squid.conf file content:
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access deny all
So none of the clients are allowed to make requests?
I'd expect to see an access control here letting the intercepted network
through:
acl localnet src 192.168.0.0/16
http_access deny !localnet
and drop the "deny all" down a bit....
icp_access allow all
'allow all' with no ICP port configured (icp_port 0 below)? Looks like
you can kill this.
hierarchy_stoplist cgi-bin ?
access_log /var/log/squid/access.log squid
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
coredump_dir /var/spool/squid
cache_dir ufs /var/spool/squid 700 32 512
http_port 3128 transparent
icp_port 0
cache_peer localhost.home.nl parent 8080 0 default
acl HOME dstdomain .home.nl
always_direct allow all
never_direct allow all
Those two lines contradict each other: 'everything MUST go direct' +
'nothing is EVER allowed to go direct'.
You want just:
never_direct allow HOME
never_direct deny all
cache_peer_access localhost.home.nl allow HOME
cache_peer_access localhost.home.nl deny all
http_access allow HOME
.. the 'http_access deny all' I mentioned dropping down goes about here,
AFTER the peer access config.
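Assembled, the relevant part of squid.conf would then look something
like this (a sketch only, pulling together the pieces above; the rest of
your config stays as it is):

acl localnet src 192.168.0.0/16
acl HOME dstdomain .home.nl

http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access deny !localnet

never_direct allow HOME
never_direct deny all
cache_peer_access localhost.home.nl allow HOME
cache_peer_access localhost.home.nl deny all

http_access allow HOME
http_access deny all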
I executed these commands:
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j DNAT --to 192.168.240.23:3128
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
Okay so far. What about intercepting the requests clients make directly
to your web app?
Since the app knows it is running on port 8080 it will advertise that
port in its URLs, and because the 'clients' do not know about Squid they
will not ask for those objects over port 80.
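If the clients do request those URLs on port 8080 directly, that traffic
needs intercepting as well. A rough sketch, assuming eth0 is where the
client requests arrive (PREROUTING only catches traffic coming in from
other hosts, not requests generated on the Squid box itself):

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 8080 -j REDIRECT --to-port 3128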
The cache.log content is this:
2008/06/11 11:30:52| Starting Squid Cache version 2.6.STABLE19 for
i386-redhat-linux-gnu...
2008/06/11 11:30:52| Process ID 8617
2008/06/11 11:30:52| With 1024 file descriptors available
2008/06/11 11:30:52| Using epoll for the IO loop
2008/06/11 11:30:52| ipcacheAddEntryFromHosts: Bad IP address 'tele1'
2008/06/11 11:30:52| ipcacheAddEntryFromHosts: Bad IP address 'svc1'
Your hosts file has corrupt content.
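Each /etc/hosts line needs an IP address first, then one or more
hostnames; 'tele1' and 'svc1' look like names sitting on a line with a
missing or mangled address. The intended format is something like this
(the addresses here are placeholders only, use the hosts' real ones):

192.168.233.158 tele1
192.168.240.23 svc1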
Apart from all that, squid looks to be running fine.
Amos
--
Please use Squid 2.7.STABLE1 or 3.0.STABLE6