Hi,

I am in charge of a computer lab for a school. Our connection to the Net is spotty at best, because another company is providing it. Bandwidth is one of the issues, but they also created a group account for Internet access, and that creates many other problems.

Now the problem is that teachers need the lab with access to certain pages online for a certain period of time, and it has happened many times that when they come to the lab they cannot reach the pages and links they need.

I was thinking of:

1) asking teachers which pages they need to access;
2) getting those pages (including pictures, mp3 files, probably even YouTube data feeds, etc.) with a crawler/download manager, probably from my home, into a local folder (see the rough sketch after this list);
3) using a descriptor file, or some format squid understands, for all the downloaded files; and
4) transferring all the files to the box running squid as a transparent proxy server, so that students can access the pages.

I think this is feasible, and I think other people have been forced to do much the same. Any links or ideas you would like to share with me?

Thanks
lbrtchx
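For step 2, a mirroring tool such as wget (for example, wget --mirror --page-requisites --convert-links) or HTTrack will pull a page together with its images and other embedded assets. If you would rather script the first pass yourself, below is a minimal Python sketch of that step, assuming the teachers hand you a plain-text list of URLs; the names urls.txt and mirror/ are just placeholders, and unlike wget/HTTrack it only fetches the listed pages themselves, not the things they link to.

#!/usr/bin/env python3
# Rough sketch of step 2: fetch a teacher-supplied list of URLs into a
# local folder that can later be copied to the squid box.
# urls.txt and mirror/ are placeholder names, not anything squid requires.

import os
import urllib.parse
import urllib.request

URL_LIST = "urls.txt"   # one URL per line, collected from the teachers
OUT_DIR = "mirror"      # local folder to be transferred to the proxy box


def local_path(url: str) -> str:
    # Map a URL to a path under OUT_DIR,
    # e.g. http://host/a/b.html -> mirror/host/a/b.html
    parsed = urllib.parse.urlparse(url)
    path = parsed.path.lstrip("/") or "index.html"
    return os.path.join(OUT_DIR, parsed.netloc, path)


def fetch(url: str) -> None:
    dest = local_path(url)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    with urllib.request.urlopen(url, timeout=30) as resp, open(dest, "wb") as out:
        out.write(resp.read())
    print("saved", url, "->", dest)


if __name__ == "__main__":
    with open(URL_LIST) as f:
        for line in f:
            url = line.strip()
            if not url or url.startswith("#"):
                continue
            try:
                fetch(url)
            except Exception as exc:   # keep going if one page fails
                print("FAILED", url, exc)

Once the files are on the proxy box, one option worth looking into instead of a descriptor file is simply fetching the same URL list through squid from inside the lab so the pages end up in its cache, but I have not tried that myself.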