
Re: External helper consumes too many DB connections

Hey Alex,

> If there are a lot more requests than your users/TTLs should generate,
> then you may be able to decrease db load by figuring out where the extra
> requests are coming from.

Actually, now that I think about it again, I don't think it matters much:
per my requirements I need to reload the cache every 60 seconds, so even
with perfect caching MariaDB will still see a high load. I think the
second approach will be better suited.

> and/or adding database
> power (e.g., by introducing additional databases running on previously
> unused hardware -- just like your MariaDB idea).

That is an excellent point. I think I will work on a central connection
aggregator as you suggested, and also add more DB power via Redis on K8S.
That way it will be both fast and able to scale automatically.
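For concreteness, a minimal sketch of that once-a-minute sync job,
assuming a hypothetical users(username, password_hash) table in MariaDB
and the pymysql and redis-py client libraries (all names, hosts, and the
schema here are illustrative, not from this thread):

    import time

    import pymysql  # MariaDB driver (assumed)
    import redis    # redis-py client (assumed)

    DB_ARGS = dict(host="mariadb", user="squid", password="secret", db="auth")
    r = redis.Redis(host="localhost", port=6379)

    def sync_once():
        # Copy all credentials from MariaDB into Redis in one pass, so
        # this is the only MariaDB connection opened per minute.
        conn = pymysql.connect(**DB_ARGS)
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT username, password_hash FROM users")
                pipe = r.pipeline()
                for username, password_hash in cur.fetchall():
                    # Expire after two sync intervals so stale entries
                    # die even if a sync run is missed.
                    pipe.set("auth:" + username, password_hash, ex=120)
                pipe.execute()
        finally:
            conn.close()

    while True:
        sync_once()
        time.sleep(60)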

> aggregating helper-db connections (helpers can be written to
> talk through a central connection aggregator)

That sounds like exactly what I am looking for. How would one go about doing this?
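One way to sketch it: run a tiny local daemon that owns the only MariaDB
connection and listens on a Unix socket; every helper process writes its
"username password" queries there instead of dialing the database itself.
A minimal sketch, assuming the same hypothetical users table as above and
the pymysql driver (socket path and wire format are made up for the
example):

    import os
    import socketserver
    import threading

    import pymysql  # assumed driver; any DB-API module would do

    DB_ARGS = dict(host="mariadb", user="squid", password="secret", db="auth")
    SOCKET_PATH = "/var/run/squid/auth-aggregator.sock"

    _lock = threading.Lock()  # serialize access to the shared connection
    _conn = pymysql.connect(**DB_ARGS)

    def check_credentials(username, password):
        # Return True if username/password match a row in the users table.
        with _lock:
            _conn.ping(reconnect=True)  # transparently re-open if dropped
            with _conn.cursor() as cur:
                cur.execute(
                    "SELECT 1 FROM users"
                    " WHERE username=%s AND password_hash=SHA2(%s,256)",
                    (username, password),
                )
                return cur.fetchone() is not None

    class Handler(socketserver.StreamRequestHandler):
        def handle(self):
            # One "username password\n" query per line, answered with
            # OK/ERR, mirroring the basic-auth helper protocol.
            for line in self.rfile:
                try:
                    user, pw = line.decode().rstrip("\n").split(" ", 1)
                    ok = check_credentials(user, pw)
                except Exception:
                    ok = False
                self.wfile.write(b"OK\n" if ok else b"ERR\n")
                self.wfile.flush()

    if __name__ == "__main__":
        if os.path.exists(SOCKET_PATH):
            os.unlink(SOCKET_PATH)
        with socketserver.ThreadingUnixStreamServer(SOCKET_PATH, Handler) as srv:
            srv.serve_forever()

Each helper then opens SOCKET_PATH once at startup and relays its stdin
lines to the daemon, so hundreds of helper processes end up sharing a
single database connection.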

On Tue, Feb 8, 2022 at 4:41 PM Alex Rousskov <rousskov@xxxxxxxxxxxxxxxxxxxxxxx> wrote:
On 2/8/22 09:13, roee klinger wrote:

> I am running multiple instances of Squid in a K8S environment; each
> Squid instance has a helper that authenticates users based on their
> username and password. The helper scripts are written in Python.
>
> I have been facing an issue, that when under load, the helpers (even
> with 3600 sec TTL) swamp the MariaDB instance, causing it to reach 100%
> CPU. I believe this is because each helper opens its own connection
> to MariaDB, which adds up to a lot of connections.
>
> My initial idea was to create a Redis DB next to each Squid instance and
> connect each Squid to its own dedicated Redis. I will sync Redis with
> MariaDB every minute, thus decreasing the connection count from a few
> hundred to just one per minute. This will also improve speeds, since
> Redis is much faster than MariaDB.
>
> The problem is, however, that there will still be many connections from
> Squid to Redis, and that will probably consume a lot of DB resources
> as well, which I don't actually know how to optimize, since it seems
> that Squid opens many processes, and there is no way to get them to talk
> to each other (except TTL values, which seem not to help in my case,
> and I also don't understand why that is).
>
> What is the best practice to handle this, considering I have the
> following requirements:
>
>     1. Fast
>     2. Refresh data every minute
>     3. Consume as few DB resources as possible

I would start from the beginning: Does the aggregate number of database
requests match your expectations? In other words, do you see lots of
database requests that should not be there given your user access
patterns and authentication TTLs? In yet other words, are there many
repeated authentication accesses that should have been authentication
cache hits?
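One rough way to check, assuming you make each helper log one
"epoch-seconds username" line per database lookup (a hypothetical log
format; Squid does not produce this by itself): count the lookups for a
given user that land inside a single TTL window, since each of those
should have been a cache hit.

    import sys
    from collections import defaultdict

    TTL = 3600  # seconds, matching the helper's configured TTL

    last_seen = defaultdict(lambda: float("-inf"))
    redundant = defaultdict(int)

    for line in sys.stdin:
        parts = line.split()
        if len(parts) != 2:
            continue
        ts, user = float(parts[0]), parts[1]
        if ts - last_seen[user] < TTL:
            redundant[user] += 1  # a lookup the cache should have absorbed
        last_seen[user] = ts

    for user, n in sorted(redundant.items(), key=lambda kv: -kv[1]):
        print(f"{n:6d} redundant lookups for {user}")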

If there are a lot more requests than your users/TTLs should generate,
then you may be able to decrease db load by figuring out where the extra
requests are coming from. For example, it is possible that your
authentication cache key includes some noise that renders caching
ineffective (e.g., see comments about key_extras in
squid.conf.documented). Or maybe you need a bigger authentication cache.
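For reference, an illustrative squid.conf fragment (the directive names
are real; the helper path and the values are examples only):

    auth_param basic program /usr/local/bin/auth_helper.py
    auth_param basic children 20 startup=5 idle=2
    auth_param basic credentialsttl 1 hour
    # If key_extras is configured, make sure it adds nothing that varies
    # per request, or every lookup becomes a cache miss:
    # auth_param basic key_extras "..."
    authenticate_ttl 1 hour
    authenticate_cache_garbage_interval 1 hour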

If the total stream of authentication requests during peak hours is
reasonable, with few unwarranted cache misses, then you can start
working on aggregating helper-db connections (helpers can be written to
talk through a central connection aggregator) and/or adding database
power (e.g., by introducing additional databases running on previously
unused hardware -- just like your MariaDB idea).


Cheers,

Alex.
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users
