Re: Content Adaptation with HTTPs

On 21/08/17 08:06, Christopher Ahrens wrote:
Amos Jeffries wrote:
On 20/08/17 16:05, Christopher Ahrens wrote:

The current solution doesn't work for me since it only supports a very
limited number of clients.  I am working with a charity that provides
internet services to people with impaired vision. The intention of my
project was to set up a semi-public proxy for recipients of the charity
(e.g., we would install DD-WRT-like routers in their homes that
would create a tunnel into our network, so that they could browse the
internet using off-the-shelf systems).  We recently received a large
number of tablets from a corporate donor; the tablets themselves will
work for our recipients, but unfortunately the internet at large does
not.

FYI: If you can get the adaptation part to be small enough a non-caching
Squid should be able to run on those WRT-like devices with under 32 MB
of RAM needed. So the tunnel may not be necessary, just a way to update
the software and its config.
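
A non-caching squid.conf along those lines could be quite small. The sketch below is illustrative only (the port, subnet, and memory values are assumptions to be adjusted for the actual deployment), not a tuned recommendation:

```
# squid.conf sketch for a low-memory WRT-class device (values illustrative)
http_port 3128

# Disable caching entirely; the box only proxies and adapts
cache deny all
cache_mem 8 MB
maximum_object_size_in_memory 0 KB

# Release idle memory back to the system instead of pooling it
memory_pools off

# Basic access control: only the local LAN (adjust the subnet)
acl localnet src 192.168.1.0/24
http_access allow localnet
http_access deny all
```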

Part of it is to pre-shrink the pages to avoid saturating the tunnel. A lot of our recipients have low-cost internet connections (usually between 1-5 Mbps). From my personal experience, the transformations are probably cutting about 75%-80% of excess garbage from websites.

We're also looking at possibly building tiny x86- or ARM-based boxes that can be deployed to their homes to do caching and further reduce the load on their internet connections. The biggest complaint we get is about how long it takes to load pictures and text, especially since many of the pictures are the same from page to page (I am having a very hard time arguing with that...).

We can get a lot of hardware from local companies, but not so much in the way of software or services.


You might be interested in the Store-ID feature then. Eliezer has done some nice experiments with using object hashes to further reduce the data transfer between a parent and child proxy when URL de-duplication is not quite enough by itself.
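
For reference, Store-ID is wired up as a rewrite helper in squid.conf. A minimal sketch using the bundled storeid_file_rewrite helper might look like the following (the helper path, database path, and the CDN pattern are illustrative assumptions, not a tested setup):

```
# squid.conf: route cache-key decisions through a Store-ID helper
store_id_program /usr/lib/squid/storeid_file_rewrite /etc/squid/storeid.db
store_id_children 5 startup=1

# /etc/squid/storeid.db is tab-separated: URL regex, then the canonical
# store ID. Example: collapse a CDN's numbered mirror hostnames onto one
# cache key so identical objects are stored (and fetched) only once:
#
# ^http:\/\/cdn[0-9]+\.example\.com\/(.*)	http://cdn.squid.internal/$1
```

URLs rewritten to the same store ID de-duplicate in the cache, which is exactly the effect you would want between a parent and child proxy.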


Amos
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users



