
Re: limiting connections

On Thu, Apr 5, 2012 at 7:01 AM, H <hm@xxxxxxxxx> wrote:
> Carlos Manuel Trepeu Pupo wrote:
>> On Tue, Apr 3, 2012 at 6:35 PM, H <hm@xxxxxxxxx> wrote:
>>> Eliezer Croitoru wrote:
>>>> On 03/04/2012 18:30, Carlos Manuel Trepeu Pupo wrote:
>>>>> On Mon, Apr 2, 2012 at 6:43 PM, Amos Jeffries <squid3@xxxxxxxxxxxxx>
>>>>> wrote:
>>>>>> On 03.04.2012 02:21, Carlos Manuel Trepeu Pupo wrote:
>>>>>>>
>>>>>>> Thanks a lot!! That's what I was missing; everything works fine
>>>>>>> now, so I can use this script, since it already works.
>>>>>>>
>>>>>>> Now I need to know if there is any way to consult the active
>>>>>>> requests in Squid that works faster than squidclient! (A sketch
>>>>>>> of the squidclient call follows below.)
>>>>>>>
>>>>>>
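A minimal sketch of the kind of bash helper being discussed, assuming a
local Squid listening on 127.0.0.1:3128 with cache manager access from
localhost; the host, port, and threshold are assumptions, not the
poster's actual script:

    #!/bin/bash
    # External ACL helper sketch: Squid feeds one URL per line on stdin;
    # answer OK when that URL already appears among the active requests.
    # The current request itself is usually already listed, hence > 1.
    while read url; do
        count=$(squidclient -h 127.0.0.1 -p 3128 mgr:active_requests |
                grep -cF "uri $url")
        if [ "$count" -gt 1 ]; then
            echo OK     # another download of the same URL is running
        else
            echo ERR
        fi
    done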
>>>>>> ACL types are pretty easy to add to the Squid code. I'm happy to
>>>>>> throw an
>>>>>> ACL patch your way for a few $$.
>>>>>>
>>>>>> Which comes back to my earlier, still unanswered question about
>>>>>> why you want to do this very, very strange thing?
>>>>>>
>>>>>> Amos
>>>>>>
>>>>>
>>>>>
>>>>> OK!! Here is the complicated and strange explanation:
>>>>>
>>>>> Where I work we have 128 Kbps shared by almost 80 PCs, and a few of
>>>>> them use download accelerators and saturate the channel. I began to
>>>>> use the maxconn ACL, but I still have a few problems: 60 of the
>>>>> clients are behind an ISA server that I don't administer, so I
>>>>> can't apply maxconn to them like the others. Now, with this ACL,
>>>>> everyone can download, but with only one connection. That's the
>>>>> strange main idea (see the maxconn sketch below).
>>>> What do you mean by only one connection?
>>>> If they are behind one ISA server, then all of them share the same
>>>> external IP.
>>>>
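A minimal squid.conf sketch of the maxconn approach described above; the
address range is a placeholder, and note that maxconn counts connections
per client IP, so everyone behind the ISA server appears as one client:

    # Placeholder range for the directly connected clients.
    acl localclients src 192.168.0.0/24
    # Matches once a client IP holds more than one connection.
    acl oneconn maxconn 1
    # Refuse the second and further simultaneous connections.
    http_access deny localclients oneconn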
>>>
>>> Hi
>>>
>>> I am following this thread with mixed feelings of weirdness and
>>> admiration ...
>>>
>>> there are always two ways to reach a far point: left around or right
>>> around the world; depending on your position, one of them is always
>>> the longer one. I can understand that someone without hurry or money
>>> issues chooses the longer one, perhaps also for the greater chance of
>>> adventurous happenings, the unknown, and the unexpected
>>>
>>> so now I have explained, in a similarly long way, what I do not
>>> understand: why would you write such complicated, out-of-scope code,
>>> slow and certainly dangerous ... if at least it were Perl, but bash
>>> calling an external program and grepping, whew ... when you can solve
>>> it with a single line of code?
>>>
>>> This task would fit pf or ipfw much better; it would be more elegant,
>>> zillions of times faster, and more secure, not to speak of the time
>>> investment. How much time do you need to write 5/6 keywords of code?
>>> (See the pf sketch below.)
>>>
>>> Or is it for demonstration purposes, showing it as an alternative
>>> possibility?
>>>
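For comparison, a one-rule pf.conf sketch of the limit being suggested;
the interface macro and port are assumptions (ipfw offers comparable
per-address limits):

    # At most one completed TCP connection per source address;
    # $int_if is a placeholder for the LAN-facing interface.
    pass in on $int_if proto tcp from any to any port 80 \
        keep state (max-src-conn 1)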
>>
>> It's great to read this. I only know bash shell, but if you tell me
>> that I can make this safer and faster... In a previous post I asked
>> about this!! I asked whether someone could tell me a better way to do
>> it; I'm new!! Please guide me if you can.
>>
>
>
> who knows ...
>
> what is your purpose? To solve bandwidth problems? Connection rate?
> Congestion? I believe that limiting to *one* download is not your real
> intention, because the browser could still open hundreds of regular
> pages, and then your download limit is nuked and was for nothing ...
>
> what is your operating system?
>

I intend to solve the bandwidth problem: for the people who use download
managers or accelerators, just limit them to one connection. I also
tried to solve it with delay_pools; the traffic delivered to each client
was shaped just as I configured, but with accelerators the upload side
still saturated the channel. (A delay_pools sketch follows below.)
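For reference, a minimal delay_pools sketch of the setup described
above, assuming one aggregate pool sized for the 128 Kbps link (values
are bytes per second); delay_pools shapes only the data Squid sends back
to clients, which matches the symptom here: the accelerators' upstream
traffic is untouched:

    # One class-1 (aggregate) pool capped near 128 kbit/s (~16 kB/s).
    delay_pools 1
    delay_class 1 1
    delay_parameters 1 16000/16000
    # The built-in 'all' ACL sends every client through the pool.
    delay_access 1 allow all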

>
>
> --
> H
> +55 11 4249.2222
>

