Re: Problem to access to a specific url (correo-gto.com.mx) with squid 2.7

Hi Eliezer, thanks for your advice and response.

I will be more careful next time; this is my first time on a forum
like this.

I have my doubts about this being a network issue, because the proxy
is behind a physical gateway that I do not control; a workmate manages
it, and he says everything is correct on his side.

The thing is that if the proxy goes out to the internet via another
modem instead of the physical gateway, the client can reach
www.correo-gto.com.mx without any problem; when I connect it the way
it is supposed to be connected, that is when I have the problem.
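
About your suggestion to test in plain forward mode: if I understand
it correctly, a test like this from the client should force the
request through the proxy explicitly (3128 is the port from my
squid.conf; the proxy IP here is just a placeholder):

# http_proxy=http://<proxy-ip>:3128 wget http://www.correo-gto.com.mx/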

I want to get to the bottom of the real problem and see if something
is wrong with the server. On the other hand, I have installed a new
server with squid 3.1 at the same level as the old proxy (the one with
the problem); the new proxy works fine, and the client can reach the
page www.correo-gto.com.mx.

I followed your suggestion to try wget, and here are the results:

(1)
To the web page where I have the problem, it gives me a timeout:

# wget http://www.correo-gto.com.mx/internacional/index.1.html
--2012-12-05 16:38:16--  http://www.correo-gto.com.mx/internacional/index.1.html
Resolving www.correo-gto.com.mx... 184.154.122.58
Connecting to www.correo-gto.com.mx|184.154.122.58|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset
by peer) in headers.
Retrying.

--2012-12-05 16:44:42--  (try: 3)  http://www.correo-gto.com.mx/internacional/index.1.html
Connecting to www.correo-gto.com.mx|184.154.122.58|:80... connected.
HTTP request sent, awaiting response...


(2)
To a different web page, it downloads successfully:

# wget http://curl.haxx.se/docs/manpage.html
--2012-12-05 16:30:23--  http://curl.haxx.se/docs/manpage.html
Resolving curl.haxx.se... 80.67.6.50, 2a00:1a28:1200:9::2
Connecting to curl.haxx.se|80.67.6.50|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 127127 (124K) [text/html]
Saving to: `manpage.html'

100%[======================================>] 127,127     --.-K/s   in 0.006s

2012-12-05 16:30:25 (20.9 MB/s) - `manpage.html' saved [127127/127127]
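
Since you mention that squid 2.7 does not support HTTP/1.1, I suppose
a next step could be to force an HTTP/1.0 request with curl from the
proxy machine, and to capture the traffic to see which side sends the
reset; this is only my guess at a useful test (curl's -0 flag forces
HTTP/1.0):

# curl -v -0 http://www.correo-gto.com.mx/internacional/index.1.html
# tcpdump -n host 184.154.122.58 and tcp port 80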

I have been looking at how to migrate from squid 2.7 to squid 3.1,
but I have not found a way that is clear to me; I do not know whether
simply removing the old package and installing the new one will be
enough.
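
From what I have read, on a Debian/Ubuntu type of system it might be
something like the following; this is only a sketch of what I
understand, not something I have tested (the squid3 package name is
my assumption):

# cp /etc/squid/squid.conf /root/squid.conf.backup
# apt-get remove squid
# apt-get install squid3
# squid3 -k parse
# /etc/init.d/squid3 restart

That is: back up the old config, swap the packages, check that the
new config parses, and restart the new service.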

Thanks, and have a great day.

On Wed, Dec 5, 2012 at 3:44 PM, Eliezer Croitoru <eliezer@xxxxxxxxxxxx> wrote:
> Hey omzatru,
>
> You indeed gave us a lot of info on the config, etc.
> The basic thing to check for is a network issue.
> If the client is not able to access the site, try not using the proxy, or
> use forward mode rather than transparent mode.
>
> This can narrow the issue down between the network level and the
> application level.
>
> Did you try using wget or curl from the squid machine to test
> connectivity?
>
> You are using a very old version; squid 2.7 has been out of support for a
> very long time.
>
> I can, however, point out that squid 2.7 doesn't support HTTP/1.1, which
> might be the source of the problem.
>
> Also, this server's responses can sometimes be very slow, maybe due to a
> reverse proxy or another device on the way.
>
> For next time, please trim the squid.conf, since the diff makes it
> unreadable.
>
> Kind Regards,
> Eliezer
>
>
> On 12/5/2012 11:23 PM, omzatru wrote:
>>
>> Hi, I have a proxy server with Squid 2.7 installed, and I have a problem
>> with a specific page.
>>
>>   www.correo-gto.com.mx
>>
>> A client cannot access this page via the proxy (squid 2.7).
>>
>> When accessing different pages I do not have this problem; browsing via
>> the proxy works fine.
>>
>> I have attached the squid config file.
>>
>> I have the following logs:
>>
>> (1)
>> log in /var/log/squid/access.log:
>> ---------------------------------
>>
>> 1354318142.058 381547 10.0.12.51 TCP_MISS/502 1634 GET
>> http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
>> 1354318175.552 378090 10.0.12.51 TCP_MISS/502 1634 GET
>> http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
>> 1354318206.135 378088 10.0.12.51 TCP_MISS/502 1634 GET
>> http://www.correo-gto.com.mx/ - DIRECT/184.154.122.58 text/html
>>
>>
>> (2)
>> Error in Firefox when accessing www.correo-gto.com.mx
>> -----------------
>>
>> ERROR
>> The requested URL could not be retrieved
>> The following error was encountered while trying to retrieve the URL:
>> http://www.correo-gto.com.mx/
>> Read Error
>> The system returned: (104) Connection reset by peer
>> An error condition occurred while reading data from the network.
>> Please retry your request.
>> Your cache administrator is webmaster.
>> Generated Fri, 31 Aug 2012 21:36:31 GMT by webproxy (squid/2.7.STABLE7)
>>
>>
>>
>> (3.a)
>> Testing nslookup from the proxy server:
>> --------------------------------
>>
>> # nslookup correo-gto.com.mx
>> Server:         10.0.0.2
>> Address:        10.0.0.2#53
>>
>> Non-authoritative answer:
>> Name:   correo-gto.com.mx
>> Address: 184.154.122.58
>>
>>
>>
>> (4.a)
>> Running tracepath to correo-gto.com.mx from the proxy server
>> ---------------------------------
>>
>> # tracepath correo-gto.com.mx
>>   1:  web.congresogto.gob.mx (10.0.0.8)                      0.200ms pmtu 1500
>>   1:  10.0.0.253 (10.0.0.253)                                0.230ms
>>   1:  10.0.0.253 (10.0.0.253)                                0.183ms
>>   2:  no reply
>>   3:  no reply
>>   4:  no reply
>> ...
>> 30:  no reply
>> 31:  no reply
>>
>>
>> I posted the problem at the link below, but I have not had any replies.
>>
>>
>> http://www.linuxquestions.org/questions/showthread.php?p=4771659#post4771659
>>
>>
>> I would appreciate it a lot if you can help me with this; I have been
>> looking for a solution, but I have not succeeded.
>>
>> On the other hand, I have configured a new test proxy with squid 3.1,
>> and it works fine; I can reach the page correo-gto.com.mx without any
>> problem.
>>
>> Thanks and have a great day.
>>
>>
>> Here are the changes that I have made to the squid.conf file:
>>
>>

>>   # should be allowed
>>   acl localnet src 10.0.0.0/8    # RFC1918 possible internal network
>>   acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
>>   acl localnet src 192.168.1.0/24        # RFC1918 possible internal network
>>   #
>>   acl SSL_ports port 443         # https
>>   acl SSL_ports port 563         # snews

>>   acl Safe_ports port 873                # rsync
>>   acl Safe_ports port 901                # SWAT
>>   acl Safe_ports port 3201       # SAP
>>   acl Safe_ports port 82         # isseg

>>   acl purge method PURGE
>>   acl CONNECT method CONNECT
>>
>>   # Lista de pAginas denegadas
>>   acl pages_deny url_regex "/etc/squid/pagesDeny.acl"
>>   acl pages_acces url_regex "/etc/squid/pagesAcces.acl"


>> @@ -673,7 +685,7 @@ http_access deny CONNECT !SSL_ports
>>   # Example rule allowing access from your local networks.
>>   # Adapt localnet in the ACL section to list your (internal) IP networks
>>   # from where browsing should be allowed
>>   http_access allow localnet
>>   http_access allow localhost
>>
>>   # And finally deny all other access to this proxy

>>   #      visible on the internal address.
>>   #
>>   # Squid normally listens to port 3128
>>   http_port 3128 transparent
>>
>>   #  TAG: https_port
>>   # Note: This option is only available if Squid is rebuilt with the

>>   #      objects.
>>   #
>>   #Default:
>>   cache_mem 1024 MB
>>
>>   #  TAG: maximum_object_size_in_memory  (bytes)
>>   #      Objects greater than this size will not be attempted to kept in

>>   #      enough to keep larger objects from hoarding cache_mem.
>>   #
>>   #Default:
>>  maximum_object_size_in_memory 512 KB
>>
>>   #  TAG: memory_replacement_policy
>>   #      The memory replacement policy parameter determines which

>>   #      (hard coded at 1 MB).
>>   #
>>   #Default:
>>   cache_dir ufs /var/spool/squid 6144 14 256
>>
>>   #  TAG: store_dir_select_algorithm
>>   #      Set this to 'round-robin' as an alternative.

>>   #      proper proxy for APT.
>>   #
>>   #Default:
>>  maximum_object_size 10240 MB
>>
>>   #  TAG: cache_swap_low (percent, 0-100)
>>   #  TAG: cache_swap_high        (percent, 0-100)

>>   #      numbers closer together.
>>   #
>>   #Default:
>>   cache_swap_low 90
>>   cache_swap_high 95
>>
>>   #  TAG: update_headers on|off
>>   #      By default Squid updates stored HTTP headers when receiving

>>   #
>>   #Default:
>>   negative_ttl 0 seconds
>>
>>   #  TAG: positive_dns_ttl       time-units
>>   #      Upper limit on how long Squid will cache positive DNS responses.

>>   #
>>   #Default:
>>   request_header_max_size 64 KB
>>
>>   #  TAG: reply_header_max_size  (KB)
>>   #      This specifies the maximum size for HTTP headers in a reply.

>>   #
>>   #Default:
>>   reply_header_max_size 64 KB
>>
>>   #  TAG: request_body_max_size  (KB)
>>   #      This specifies the maximum size for an HTTP request body.

>>   #Default:
>>   half_closed_clients off
>>
>>   #  TAG: pconn_timeout
>>   #      Timeout for idle persistent connections to servers and other

>>   #
>>   #Default:
>>   httpd_accel_no_pmtu_disc on
>>
>>   # DELAY POOL PARAMETERS
>>   #
>> -----------------------------------------------------------------------------

>>   #
>>   #Default:
>>   persistent_connection_after_error on
>>
>>   #  TAG: detect_broken_pconn
>>   #      Some servers have been found to incorrectly signal the use

>>   #
>>   #Default:
>>   icp_port 0
>>
>>   #  TAG: htcp_port
>>   #      The port number where Squid sends and receives HTCP queries to

>>   #
>>   #Default:
>>   check_hostnames off
>>
>>   #  TAG: allow_underscore
>>   #      Underscore characters is not strictly allowed in Internet hostnames

>>   #
>>   #Default:
>>   balance_on_multiple_ip off
>>
>>   #  TAG: pipeline_prefetch
>>   #      To boost the performance of pipelined requests to closer
>>
>
> --
> Eliezer Croitoru
> https://www1.ngtech.co.il
> sip:ngtech@xxxxxxxxxxxx
> IT consulting for Nonprofit organizations
> eliezer <at> ngtech.co.il



-- 
Diego

