Squid has a footprint of roughly 3 MB or more, if I remember right.
So we are now talking about a class of machines with 64-bit CPUs, and
most computers have 1 Gbps Ethernet cards.
Nowadays we use Squid for small caches or even larger ones, but in a
couple of years (who knows how many) somebody will need this kind of
caching software the same way we rely on HDDs every day, where the HDDs
themselves carry 64 MB of cache, a RAID array card uses 512 MB of
cache, and so on. Squid will probably be one of the choices, or at
least the source code base, for applications that sit on top of some
cache at 1 Gbps or even more.
I have seen in the past how refresh_pattern has a huge effect on how
content is served, and I was wondering to myself:
"As a user operating a machine with 100 Gbps (total) network speed,
will I use a Python-, Ruby-, Perl-, or Go-based proxy, or will I take
good old Squid, which surpasses them all in speed and on other levels?"
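For anyone who has not played with it, here is a minimal sketch of the
kind of refresh_pattern tuning I mean (the image pattern and lifetimes
are examples only, not a recommendation; the last line is Squid's
shipped default rule):

    # Cache static images aggressively: fresh for at least 1 day (1440
    # minutes), at most 1 week (10080 minutes), and treat an object as
    # fresh up to 90% of its age when the origin sends no expiry info.
    refresh_pattern -i \.(jpg|jpeg|png|gif)$ 1440 90% 10080

    # Default rule for everything else: conservative freshness.
    refresh_pattern . 0 20% 4320

A couple of lines like these can change the hit ratio dramatically,
which is why the choice of cache software matters so much at high speeds.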
I am not comparing languages or anything; I am merely trying to find
open-source HTTP caching proxies other than Squid and Varnish.
If you have any thoughts that touch on my question, I will be happy to
see some comments.
Thanks,
Eliezer