Hello,

I've run many tests, but Squid still doesn't seem to serve the files from its cache. I believe the objects are being cached: I increased the maximum in-memory object size, and I can now see memory filling up as I transfer / GET the files from SharePoint Online / Office 365.

Do you think any configuration change would help? I was thinking about rewriting the URLs upfront, before the Squid cache proxy, in a chained configuration (see the rough sketch in the PS below), but I am trying to avoid that for now.

My Squid config:

-------------------------------------------------------------------
acl allsrc src all
http_access allow allsrc
#
http_port 3128
#
cache_dir ufs /cygdrive/c/squidcache 100 16 256
#
cache_mem 128 MB
minimum_object_size 0 bytes
maximum_object_size 50 MB
maximum_object_size_in_memory 10 MB
max_stale 1 month
#
coredump_dir /var/cache/squid
#
debug_options 11,2
#
refresh_pattern -i \.(jpg|gif|png|txt|docx|xlsx|pdf) 30240 100% 43800 override-expire ignore-private ignore-reload store-stale
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
#
https_port 10.10.10.10:443 accel ignore-cc defaultsite=tenant.sharepoint.com cert=/cygdrive/c/squidssl/sharepoint.com.crt key=/cygdrive/c/squidssl/sharepoint.com.key
#
cache_peer 13.107.6.151 parent 443 0 originserver login=PASSTHRU connection-auth=on ssl sslflags=DONT_VERIFY_PEER
-------------------------------------------------------------------

Regards,
Olivier MARCHETTA
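
PS: To illustrate what I mean by rewriting URLs upfront, below is a rough, untested sketch of a Squid url_rewrite helper in Python. The tenant host name, the file-extension pattern, and the helper path are placeholders for my setup, not something I have working; the idea is only to strip the query string from SharePoint download URLs so repeated GETs of the same file map to one cacheable URL.

-------------------------------------------------------------------
#!/usr/bin/env python3
# Sketch of a Squid url_rewrite helper (untested, placeholders only).
# It would be hooked into squid.conf with something like:
#   url_rewrite_program /usr/bin/python3 /cygdrive/c/squidhelpers/sp_rewrite.py
#   url_rewrite_children 5

import re
import sys

# Placeholder pattern: my tenant host plus the extensions from my refresh_pattern
PATTERN = re.compile(
    r'^(https://tenant\.sharepoint\.com/[^?]+\.(?:jpg|gif|png|txt|docx|xlsx|pdf))\?.*$',
    re.IGNORECASE)

def main():
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        # Default helper input: the request URL followed by extras (client IP, method, ...)
        url = line.split(None, 1)[0]
        match = PATTERN.match(url)
        if match:
            # Rewrite to the query-less URL so Squid sees a single cacheable object
            sys.stdout.write('OK rewrite-url="%s"\n' % match.group(1))
        else:
            # ERR = leave this request URL untouched
            sys.stdout.write('ERR\n')
        sys.stdout.flush()

if __name__ == '__main__':
    main()
-------------------------------------------------------------------

I realise this changes the URL actually sent upstream, which may break SharePoint's query-string tokens; if so, Squid's Store-ID feature (store_id_program) might be the safer route, since it normalises only the cache key and leaves the request URL intact.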