Re: External C program

On Wed, Apr 29, 2009 at 1:22 AM, Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:
> Julien Philibin wrote:
>>
>> Hi John,
>> thanks for your reply.
>>
>> I'll give a shot with your skeleton and see how things are going on ...
>>
>> On Tue, Apr 28, 2009 at 1:59 AM, John Doe <jdmls@xxxxxxxxx> wrote:
>>>
>>> From: Julien Philibin <julien@xxxxxxxxxxx>
>>>>
>>>> Hi, I've been trying to find a typical external ACL C program skeleton
>>>> for a while, but I wasn't able to find anything very interesting ...
>>>> What I would like to do is to read two different strings and process
>>>> them in order to allow/disallow access to a website.
>>>> The thing is, after a while I get two processes that use around 10 Mb
>>>> of memory and 15% of my CPU ....
>>>> Also, if I restart squid, I'll get two more processes running, and so
>>>> on every time I restart squid ...
>>>
>>> Personally, I use fgets/fflush and I have not seen any problems (memory
>>> leaks, etc.) so far...
>>> Something like:
>>>
>>>  #define INPUTSIZE 4096
>
> FYI: I've just had to start bumping my own custom helpers to 8196 or more
> for their buffers. Current Squid allows up to 8196 bytes for the URL length,
> and many more for the possible header lengths, so watch that on inputs.
>
>
>>>  char input[INPUTSIZE];
>>>  char *cp;
>>>  while (fgets(input, sizeof(input), stdin)) {
>>>   if ((cp = strchr(input, '\n')) == NULL) {
>>>     fprintf(stderr, "filter: input too big: %s\n", input);
>>>   } else {
>>>     *cp = '\0';   /* strip the trailing newline */
>>>   }
>>>   ...
>>>   fflush(stderr);
>>>   fflush(stdout);
>>>  }
>>>
>>> Do you use any malloc, or functions that malloc... and that would need a
>>> free?
>>
>> Yes I do, but I also free them (the memory usage doesn't change). I also
>> made a mistake: it is not 10 Mb but 1 ...
>>
>>
>> The only weird thing is that after a restart (of squid), it looks like
>> squid doesn't have any control over the external programs anymore, and they
>> (both of them) start to use a lot of CPU...
>>
>> Maybe it has something to do with stdin not being flushed correctly,
>> creating an infinite loop or something ...
>
> Probably. Squid simply closes its connection to the pipes and abandons the
> old helper, leaving the pipe to close with a '\0', I believe.
> From the docs of scanf() I don't get a clear idea of the return value when
> an empty string is received (is it 1/0/EOF?).
>

I'll try to figure it out as soon as my helper is working properly :-)
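(While I'm waiting on that, here is how I currently understand the EOF side of
things; this is just my reading of the man pages and is not tested. With
fgets() the loop should simply end when squid closes the pipe, because fgets()
returns NULL at end-of-file, so the helper exits instead of spinning on a
closed stdin. The buffer size below is arbitrary.)

  #include <stdio.h>

  int main(void) {
      char buf[8192];
      long lines = 0;

      /* fgets() returns NULL at EOF, so this loop ends as soon as
         squid closes the helper's stdin (e.g. on restart) */
      while (fgets(buf, sizeof(buf), stdin))
          lines++;

      /* with scanf() the equivalent guard would be checking for EOF,
         e.g. while (scanf("%8191s", buf) != EOF); ignoring that return
         value is what would leave the helper looping forever */
      fprintf(stderr, "helper: stdin closed after %ld lines, exiting\n", lines);
      return 0;
  }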

> Also, the scanf() you were using earlier has no concept of length and opens
> the possibility of buffer over-runs.
>
> Prefer fgets() or sscanf() with a field width as input methods.
>
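If I understand the field-width part correctly, a bare "%s" copies until the
next whitespace with no limit at all, while something like "%15s" stops after
15 characters and leaves room for the '\0'. A quick, untested illustration of
how I read it (the sizes and the sample line are made up):

  #include <stdio.h>

  int main(void) {
      const char line[] = "192.168.0.1 http://example.com/page";
      char ip[16];
      char url[8192];

      /* bounded: at most 15 and 8191 characters are copied into
         ip and url, unlike a bare "%s" which has no limit */
      if (sscanf(line, "%15s %8191s", ip, url) == 2)
          printf("ip=%s url=%s\n", ip, url);

      return 0;
  }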

Hi guys, so I've been trying to implement the source code you gave me, and I
am running into an issue.

My first string is supposed to be a source IP address (length <= 16) and the
second one the URL of the website that the user is trying to access.

When I use the fgets method, fgets(source, sizeof(source), stdin), it doesn't
work: if the IP address is shorter than 15 characters, fgets() simply reads on
into the beginning of the destination URL and everything goes wrong ....

So I was wondering, what would you guys use?

  fscanf(stdin, "%s", s);
or
  scanf("%s", source);  // as I was doing before, plus double-checking the buffer's size
or
something else?

I have to admit, all this is confusing me a little bit :-)
There must be an easy/secure way to catch two strings from stdin ...
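In case it helps to see where I'm heading, this is the direction I'm currently
leaning, based on the fgets + sscanf suggestions above. It is only a sketch and
completely untested; the OK/ERR answers and the buffer sizes are just my
assumptions about what the helper should emit:

  #include <stdio.h>
  #include <string.h>

  #define LINESIZE 8192

  int main(void) {
      char line[LINESIZE];
      char ip[16];
      char url[LINESIZE];

      while (fgets(line, sizeof(line), stdin)) {
          line[strcspn(line, "\n")] = '\0';      /* drop the newline */

          /* split the line into the two expected tokens, with field
             widths so neither buffer can overflow */
          if (sscanf(line, "%15s %8191s", ip, url) != 2) {
              printf("ERR\n");                   /* malformed request */
              fflush(stdout);
              continue;
          }

          /* placeholder: the real allow/deny decision on ip and url
             would go here */
          int allowed = 1;
          printf(allowed ? "OK\n" : "ERR\n");
          fflush(stdout);
      }

      return 0;   /* fgets() hit EOF: squid closed the pipe */
  }

Does that look roughly right to you?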

Thanks for your time guys.

> Amos
> --
> Please be using
>  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE14
>  Current Beta Squid 3.1.0.7
>

Julien

