
Re: caching apt package lists/Raspbian


 



It turns out Squid still doesn't cache the Packages.xz files. From discussions over on the Raspberry Pi forums, it seems apt is hitting the following URLs (only the Packages.xz entries are shown here) to fetch their main, contrib, non-free and rpi components.

$ apt-get --print-uris update

'http://archive.raspberrypi.org/debian/dists/buster/main/binary-armhf/Packages.xz' 
archive.raspberrypi.org_debian_dists_buster_main_binary-armhf_Packages 0

'http://archive.raspberrypi.org/debian/dists/buster/main/binary-all/Packages.xz'
archive.raspberrypi.org_debian_dists_buster_main_binary-all_Packages 0 

'http://raspbian.raspberrypi.org/raspbian/dists/buster/main/binary-armhf/Packages.xz'
raspbian.raspberrypi.org_raspbian_dists_buster_main_binary-armhf_Packages 0 

'http://raspbian.raspberrypi.org/raspbian/dists/buster/main/binary-all/Packages.xz'
raspbian.raspberrypi.org_raspbian_dists_buster_main_binary-all_Packages 0

'http://raspbian.raspberrypi.org/raspbian/dists/buster/contrib/binary-armhf/Packages.xz'
raspbian.raspberrypi.org_raspbian_dists_buster_contrib_binary-armhf_Packages 0

'http://raspbian.raspberrypi.org/raspbian/dists/buster/contrib/binary-all/Packages.xz'
raspbian.raspberrypi.org_raspbian_dists_buster_contrib_binary-all_Packages 0

'http://raspbian.raspberrypi.org/raspbian/dists/buster/non-free/binary-armhf/Packages.xz'
raspbian.raspberrypi.org_raspbian_dists_buster_non-free_binary-armhf_Packages 0

'http://raspbian.raspberrypi.org/raspbian/dists/buster/non-free/binary-all/Packages.xz'
raspbian.raspberrypi.org_raspbian_dists_buster_non-free_binary-all_Packages 0

'http://raspbian.raspberrypi.org/raspbian/dists/buster/rpi/binary-armhf/Packages.xz' 
raspbian.raspberrypi.org_raspbian_dists_buster_rpi_binary-armhf_Packages 0 

'http://raspbian.raspberrypi.org/raspbian/dists/buster/rpi/binary-all/Packages.xz' 
raspbian.raspberrypi.org_raspbian_dists_buster_rpi_binary-all_Packages 0 
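One way to see what Squid is actually doing with these requests is to look at the cache result codes in its access log. The snippet below is only a sketch: the log lines are hypothetical examples in the default Squid log format, written to a temporary file so the filter can be tried without a live proxy. On a real box you would point the `awk` at `/var/log/squid/access.log` (path may differ per distribution).

```shell
# Hypothetical Squid access.log excerpt in the default log format:
# timestamp elapsed client result/status bytes method URL user hierarchy type
cat > /tmp/access.log.sample <<'EOF'
1577836800.123    456 192.168.1.10 TCP_MISS/200 51234 GET http://raspbian.raspberrypi.org/raspbian/dists/buster/main/binary-armhf/Packages.xz - HIER_DIRECT/93.93.128.193 application/x-xz
1577836900.456     12 192.168.1.10 TCP_REFRESH_UNMODIFIED/304 311 GET http://archive.raspberrypi.org/debian/dists/buster/main/binary-armhf/Packages.xz - HIER_DIRECT/46.235.231.145 application/x-xz
EOF

# Print the cache result code (field 4) and URL (field 7) for every
# Packages.xz request. Repeated TCP_MISS entries for the same URL would
# confirm Squid is refetching rather than caching.
awk '/Packages\.xz/ {print $4, $7}' /tmp/access.log.sample
```

A run of `TCP_REFRESH_UNMODIFIED` or `TCP_MEM_HIT`/`TCP_HIT` entries would mean the objects are being cached after all; repeated `TCP_MISS` entries point at the responses being treated as uncacheable.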

I pasted these into redbot, and several of them return HTTP 404 Not Found responses; the first one above, for example, gives a 404. Would that be enough to force Squid to re-download the ones it can find each and every time? Any other suggestions on how to debug this?
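Besides redbot, the caching-related response headers can be inspected directly. This is just a sketch using one URL from the list above (it may 404 or have moved depending on mirror state): a missing `Cache-Control`/`Expires`, a very short `max-age`, or directives like `no-cache` would explain why Squid refetches the lists on every update.

```shell
# Fetch only the headers (-I) for one of the Packages.xz URLs and keep
# the status line plus the fields that control cacheability.
curl -sI 'http://raspbian.raspberrypi.org/raspbian/dists/buster/main/binary-armhf/Packages.xz' \
  | grep -iE '^(HTTP|cache-control|expires|last-modified|etag)'
```

Note that a 404 on one URL shouldn't by itself stop Squid from caching the others; each response is evaluated for cacheability on its own.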

MarkJ
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users



