
Re: Jobs

On 29/06/11 01:37, Mohsen Pahlevanzadeh wrote:
On Tue, 2011-06-28 at 06:01 -0700, John Doe wrote:
From: Mohsen Pahlevanzadeh <mohsen@xxxxxxxxxxxxxxxxx>

We must write a program that, along with normal tasks, does a variety of
jobs, but I need to PURGE and insert into the cache.
I must write a web application that manages Squid with many extra jobs.
Normal jobs: everything that Squid can do.
Variety of jobs: a range of tasks that my boss ordered.

Amos' question: what are these "normal tasks" and "variety of jobs"?
Your answer: extra jobs, normal jobs, a variety of jobs, a range of tasks...
Which does not answer the question at all...
Can you name the main tasks/jobs you need to do?
For example: start/stop/restart/reload Squid, reset the cache, purge/cache a URL?
Graph statistics, etc...
I believe that for most of these, you do not need to play with the Squid code...

JD
1. PURGE from my program, but I can't call squidclient -m PURGE
"blahblah" from my code.

Sure you can. Several prefetchers just run exec("squidclient blah").


But now that it is clear you are building a whole management app, not just a prefetcher, an HTTP library would probably be the better way to go: libcurl, or whatever the equivalent is for your chosen language.
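For illustration, here is a rough sketch of that HTTP-library approach in Python. It assumes Squid is listening on 127.0.0.1:3128, that the purged URL is just an example, and that squid.conf already contains an ACL allowing the PURGE method; adjust for your own setup.

import http.client

PROXY_HOST = "127.0.0.1"   # assumed: Squid listening on localhost
PROXY_PORT = 3128          # assumed: default Squid port

def purge(url):
    """Send a PURGE request for `url` through the local Squid instance."""
    conn = http.client.HTTPConnection(PROXY_HOST, PROXY_PORT)
    # When talking to a proxy, the request line carries the absolute URL.
    conn.request("PURGE", url)
    resp = conn.getresponse()
    resp.read()
    conn.close()
    # 200 means the object was purged, 404 means it was not in the cache.
    return resp.status

if __name__ == "__main__":
    print(purge("http://example.com/some/object"))

The same thing can be done with libcurl in C by setting a custom request method, if that is the chosen language.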

2. Insert into the cache.

An HTTP GET request. Tricky, since you will have to figure out whether the clients will be asking for plain or compressed copies.

By far the best way to do this is simply not to bother doing it at all. Squid is designed to do the work of figuring out where objects are and how to get them to the client fastest.

Inserting objects into the cache may _seem_ to be a good idea. But HTTP is very complicated and there is a very good chance you will push the wrong variants of each object into the cache.
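If you do end up warming the cache anyway, the usual shape of it is a plain GET routed through the proxy. A rough Python sketch, assuming the same 127.0.0.1:3128 Squid and an example URL; the Accept-Encoding header is exactly where the variant problem bites, since each value can map to a different cached copy when the origin sends Vary: Accept-Encoding.

import urllib.request

PROXY = {"http": "http://127.0.0.1:3128"}   # assumed: local Squid as the proxy

def prefetch(url, accept_encoding="identity"):
    """Fetch `url` through Squid so a copy lands in the cache."""
    opener = urllib.request.build_opener(urllib.request.ProxyHandler(PROXY))
    req = urllib.request.Request(url, headers={"Accept-Encoding": accept_encoding})
    with opener.open(req) as resp:
        resp.read()   # read the whole body so the fetch completes
        return resp.status

if __name__ == "__main__":
    # Warming both plain and gzip copies; whether either matches what real
    # clients will ask for is exactly the guesswork described above.
    prefetch("http://example.com/page.html", "identity")
    prefetch("http://example.com/page.html", "gzip")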

3. Concurrent retrieval of sites (minimum 100 sites).

If by "site" you mean website. Squid is used by ISP. They have accessible site numbers ranging in the high millions or billions. These are all concurrently available to an ISP situation, so safe bet on that requirement.

If by "site" you mean visitor. One Squid routinely handles hundreds or thousands of clients depending on your hardware specs. Or it may overload the network on _one_ client requesting TB sized objects.

You need to figure out a requests-per-time-unit metric or a concurrent-connections metric and test that it is achievable with the desired configuration. The squid config file is a mix of simple on/off/value settings and a big script which tells Squid how to operate on a request. Seemingly simple changes can easily raise or lower the response speed by whole orders of magnitude.
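As a starting point, even a crude requests-per-second measurement through the proxy gives you a number to test against after each config change. A rough Python sketch, assuming the same local Squid and a hypothetical list of test URLs; purpose-built benchmarking tools will do this far better.

import time
import urllib.request

PROXY = {"http": "http://127.0.0.1:3128"}     # assumed: local Squid
TEST_URLS = ["http://example.com/"] * 100     # hypothetical workload

def requests_per_second(urls):
    """Fetch every URL through the proxy and return a requests/second figure."""
    opener = urllib.request.build_opener(urllib.request.ProxyHandler(PROXY))
    start = time.monotonic()
    for url in urls:
        with opener.open(url) as resp:
            resp.read()
    return len(urls) / (time.monotonic() - start)

if __name__ == "__main__":
    print("%.1f req/s" % requests_per_second(TEST_URLS))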

4. The permanent configuration must have a separate file.

Permanent as opposed to what? Randomly thrown-in changes to the network layout? Arbitrary changes to the access permissions? Arbitrary changes to the stored cache objects?

5. Write a web app to manage Squid.

You will need to define and clarify "manage" if you want any more help from us on that.

Cache content purges, start, stop, restart, and log rotation are all triggered by asking Squid for certain cache manager URLs. Statistical reports and network measurements are available at other URLs.

Run "squidclient mgr:menu" to see what is available to your app via these HTTP cache_object:// URL requests from the cache manager. The data reports are lists of current operating state. Some organized for humans some for machine processing. For graphing things over time SNMP requests are better to retrieve specific counter data.


At least this project specifies the above, and minimum/maximum sites, files, and times.

Time? That way lies danger. Beware the boss who says ALL objects MUST be delivered within 10 seconds. For one day you are sure to get a client on a 56K modem or a satellite relay.

Cheers

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.12
  Beta testers wanted for 3.2.0.9 and 3.1.12.3

