"concurrency" attribute external_acl_type

List,
1) Does the "concurrency" attribute simply indicate how many items in
a batch will be sent to the external helper?  (A sketch of where I'd
declare it follows these questions.)

1.1) Assuming concurrency is set to "6", for example, and a user's
browser session sends "7" actual URLs through the proxy, does this
mean "6" will go to the first instance of the external helper, and
the "7th" will go to a second instance of the helper?

1.1.1) Assuming the 6 from the first part of the batch return "OK"
and the 7th returns "ERR", will the user's browser session render the
6 and not render the 7th?  More importantly, how does Squid know that
the two batches - one of 6 and one of 1, for the 7 total - all came
from the same browser session?
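
To make question 1) concrete, this is roughly the declaration I have
in mind (the helper name and path are placeholders, and the exact
option and format-code names may vary by Squid version):

  external_acl_type urldb children=5 concurrency=6 ttl=60 %LOGIN %URI /usr/local/bin/urldb_helper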

What I have currently:
- OpenLDAP with PostgreSQL, used for my "user database", which lets
me use the "auth_param squid_ldap_auth" module to authenticate my
users (roughly the auth_param lines sketched below)
- a PostgreSQL database storing my ACLs for the given user database
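
For reference, that authentication piece looks something like this
(the base DN and LDAP server below are placeholders for my real
values, and squid_ldap_auth's exact flags may differ by version):

  auth_param basic program /usr/lib/squid/squid_ldap_auth -b "dc=example,dc=com" -f "uid=%s" ldap.example.com
  auth_param basic children 5
  auth_param basic realm proxy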

Process:
Step1: user authenticates through squid_ldap_auth
Step2: the user-requested URL (and obviously all images, content, ...)
gets passed to the external helper
Step3: the external helper checks those URLs against the database for
the specific user and then determines "OK" or "ERR" (tied together
roughly as in the sketch below)
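
In squid.conf terms, I picture those three steps wired up along these
lines, reusing the urldb helper declared above (again, the ACL names
are placeholders):

  acl ldap_users proxy_auth REQUIRED
  acl urldb_ok external urldb
  http_access allow ldap_users urldb_ok
  http_access deny all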

Issue1:
How do I have the user-requested URL (and all images, content, ...)
passed as a batch/bundle to a single external helper instance, so I
can collectively determine "OK" or "ERR"?

Any ideas?  Is the "concurrency" attribute meant to declare a maximum
number of "requests" that go to a single external helper instance?  So
if I set concurrency to 15, should I have the external helper do a
count++ for each STDIN line that comes in, until no more arrive, so I
know I have X number in a batch/bundle?

Obviously there is no way to predetermine how many URLs/URIs will
need to be checked against the database, so if I set concurrency to
1024, presuming that to be high enough that no single request will max
it out, can I just count++ and, once the external helper is done
reading STDIN lines, process them to determine "OK" or "ERR" for that
specific request?
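
To illustrate what I mean, here is a minimal sketch of the helper
loop I have in mind.  I'm assuming that with concurrency set, each
STDIN line starts with a channel ID that has to be echoed back in the
reply, followed by whatever format codes were declared (here %LOGIN
and %URI), and check_user_url() is just a stand-in for my real
PostgreSQL lookup:

#!/usr/bin/env python
# Sketch of an external_acl_type helper using the concurrency protocol.
# Each request line is assumed to look like: "<channel-id> <user> <url>"
import sys

def check_user_url(user, url):
    # Placeholder for the real PostgreSQL lookup of this user's ACLs.
    return True

def main():
    while True:
        line = sys.stdin.readline()
        if not line:
            break
        parts = line.strip().split(' ', 2)
        if len(parts) < 3:
            continue  # malformed line; ignore it
        channel, user, url = parts
        verdict = "OK" if check_user_url(user, url) else "ERR"
        # The reply must carry the same channel ID as the request.
        sys.stdout.write("%s %s\n" % (channel, verdict))
        sys.stdout.flush()

if __name__ == "__main__":
    main()

As written, that treats every STDIN line as a self-contained request,
which is exactly why I'm unsure how the batch/bundle idea above would
map onto it.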

Issue2:
I'd like to have just a single external helper instance start up that
can fork() and deal with each URL/URI request; however, I'm not sure
whether Squid in its current incarnation passes enough information to
the helper, or permits specific enough information to be passed back
from the helper, to make this happen.
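
What I have in mind is something like the sketch below: the parent
keeps reading STDIN while a forked child does the database work and
writes back the tagged reply.  This assumes the same channel-ID style
lines as the sketch above, leans on each reply being a single short
write so the children's output doesn't interleave, and ignores error
handling for brevity:

#!/usr/bin/env python
# Sketch of a forking external_acl_type helper.
import os
import signal
import sys

def check_user_url(user, url):
    # Placeholder for the real PostgreSQL lookup.
    return True

def handle(channel, user, url):
    verdict = "OK" if check_user_url(user, url) else "ERR"
    sys.stdout.write("%s %s\n" % (channel, verdict))
    sys.stdout.flush()

def main():
    signal.signal(signal.SIGCHLD, signal.SIG_IGN)  # let the kernel reap children
    while True:
        line = sys.stdin.readline()
        if not line:
            break
        parts = line.strip().split(' ', 2)
        if len(parts) < 3:
            continue
        if os.fork() == 0:
            # Child: do the (potentially slow) lookup and answer.
            handle(*parts)
            os._exit(0)
        # Parent: loop straight back for the next request line.

if __name__ == "__main__":
    main()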

Any deeper insights would be tremendously appreciated.

Thanks,

-- 
Louis Gonzales
BSCS EMU 2003
HP Certified Professional
louis.gonzales@xxxxxxxxxxxxxx
