Re: invalid url in curl

On Tue, 09 Aug 2011 19:57:47 +0430, Mohsen Pahlevanzadeh wrote:
On Wed, 2011-08-10 at 01:20 +1200, Amos Jeffries wrote:
On 10/08/11 00:31, Mohsen Pahlevanzadeh wrote:
> Dear all,
>
> I recall what happens when I use squidclient "url" and the URL is not
> valid. Now I have tested with telnet to Google, saved the HTML file to
> google.html, and changed its headers a bit to make it cacheable:
> --------------
> curl -H "HTTP/1.1 200 OK" -H "Date: Tue, 09 Aug 2011 12:12:54 GMT" \
>   -H "Expires: Thu, 08 Sep 2011 12:12:54 GMT" \
>   -H "Cache-Control: public, max-age=29000" \
>   -H "Location: http://www.google.com/" \
>   -H "Content-Type: text/html; charset=ISO-8859-1" -H "Server: gws" \
>   -H "X-XSS-Protection: 1; mode=block" -H "X-Cache: MISS from debian" \
>   -H "Transfer-Encoding: chunked" \
>   -d @./files/Google.com/Google.html localhost:3128
> ---------------
> But I receive the invalid URL error. I don't know where to put the
> URL. Is the URL taken from the "Location" header? If so, squid should
> not give the error.
>
> I am reading the RFC again, but I think this is about what squid
> accepts. How do I hand a file off to squid with curl?

curl is client software. Just like a browser. It _receives_ files from
squid. It does not send them. Only web servers and proxies send page
objects in HTTP.

  curl --proxy 127.0.0.1:3128 http://www.google.com/

   Request sent to squid:
---------------
GET http://www.google.com/ HTTP/1.0
Host: www.google.com
User-Agent: curl
Accept: */*
Proxy-Authorization: Basic ***==
Connection: close

---------------

    squid at 127.0.0.1 contacts www.google.com,
    www.google.com sends the Reply to squid.
    squid sends it to curl

   Reply that comes back to curl:
---------------
HTTP/1.1 302 Moved Temporarily
Location: http://www.google.co.nz/
Cache-Control: private
Content-Type: text/html; charset=UTF-8
Set-Cookie: PREF=ID=***:FF=0:TM=1312894424:LM=1312894424:S=***;
expires=Thu, 08-Aug-2013 12:53:44 GMT; path=/; domain=.google.com
Date: Tue, 09 Aug 2011 12:53:44 GMT
Server: gws
Content-Length: 221
X-XSS-Protection: 1; mode=block
X-Cache: MISS from treenet.co.nz
X-Cache-Lookup: MISS from treenet.co.nz:8080
Via: 1.1 treenet.co.nz (squid/3.3.0.0)
Connection: close

---------------
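
    (To watch this exchange from the curl side you could re-run the
    command with -v added; curl then prints the request headers it
    sends prefixed with ">" and the reply headers it receives
    prefixed with "<".)

   curl -v --proxy 127.0.0.1:3128 http://www.google.com/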

    A normal web browser would follow that 302 redirect instruction
    and try again with a second Request to squid ...
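
    (curl itself does not follow redirects unless asked; to imitate the
    browser here you could add curl's -L/--location option:)

   curl -L --proxy 127.0.0.1:3128 http://www.google.com/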

   Request sent to squid:
---------------
GET http://www.google.co.nz/ HTTP/1.0
Host: www.google.co.nz
User-Agent: curl
Accept: */*
Proxy-Authorization: Basic ***==
Connection: close

---------------

    squid at 127.0.0.1 contacts www.google.co.nz,
    www.google.co.nz sends the Reply to squid.
    squid sends it to curl:

   Reply that comes back to curl:
---------------
HTTP/1.1 200 OK
Date: Tue, 09 Aug 2011 13:01:27 GMT
Expires: -1
Cache-Control: private, max-age=0
Content-Type: text/html; charset=ISO-8859-1
Set-Cookie: PREF=ID=***:FF=0:TM=1312894887:LM=1312894887:S=***;
expires=Thu, 08-Aug-2013 13:01:27 GMT; path=/; domain=.google.co.nz
Set-Cookie: NID=49=***; expires=Wed, 08-Feb-2012 13:01:27 GMT; path=/;
domain=.google.co.nz; HttpOnly
Server: gws
X-XSS-Protection: 1; mode=block
X-Cache: MISS from treenet.co.nz
X-Cache-Lookup: MISS from treenet.co.nz:8080
Via: 1.1 treenet.co.nz (squid/3.3.0.0)
Connection: close

<!doctype html><html><head><meta http-equiv="content-type"
content="text/html;
charset=ISO-8859-1"><title>Google</title><script>window.google={
---------------

Amos
If I send the object files in the directory, can I get the same result?
How can I simulate this?
--mohsen

The above was a simulation I created by actually doing the command:
   curl --proxy 127.0.0.1:3128 http://www.google.com/


The object files are in the directory '/' on the machine www.google.co.nz, accessed through an http:// service.

If you have a file transfer service (FTP) on the local machine, a file called "name" might be available to Squid as the URL "ftp://localhost/name".
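
Likewise, to simulate with the Google.html you saved, you could serve its directory with any small HTTP service on the local machine and then fetch it through squid. A rough sketch (assuming the file is still at ./files/Google.com/Google.html as in your curl command, that Python is installed, and using the module's default-style port 8000):
---------------
cd ./files/Google.com
python -m SimpleHTTPServer 8000   # Python 2; with Python 3: python3 -m http.server 8000

# then, from another terminal, request it through squid:
curl --proxy 127.0.0.1:3128 http://localhost:8000/Google.html
---------------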

Amos


