On Sat, Oct 04, 2008 at 12:55:15PM -0400, Chris Nighswonger wrote:
> On Tue, Sep 30, 2008 at 6:13 PM, Dave Dykstra <dwd@xxxxxxxx> wrote:
> >> On Thu, Sep 25, 2008 at 02:04:09PM -0500, Dave Dykstra wrote:
> >> > I am running squid on over a thousand computers that are filtering data
> >> > coming out of one of the particle collision detectors on the Large
> >> > Hadron Collider.
>
> A bit off-topic here, but I'm wondering if these squids are being used
> in CERN's new computing grid? I noticed Fermi was helping out with
> this. (http://devicedaily.com/misc/cern-launches-the-biggest-computing-grid-in-the-world.html)

The particular squids I was talking about are not considered to be part
of the grid; they're part of the "High-Level Trigger" filter farm that
is installed at the location of the CMS detector. There are other
squids that are considered to be part of the grid, however, at each of
the locations around the world where CMS collision data is being
analyzed.

I own the piece of the software involved in moving detector alignment &
calibration data from CERN out to all the processors at all the
collaboration sites, which is needed to be able to understand the
collision data. This data is on the order of 100MB but needs to get
sent to all the analysis jobs (and some of it changes every day or so),
unlike the collision data, which is much larger but gets sent
separately to individual processors. The software I own converts the
data from a database to http, where it is cached in squids, and then
converts the data from http to objects in memory. The home page is
frontier.cern.ch.

That article is misleading, by the way; the very nature of a computing
grid is that it doesn't belong to a single organization, so it's not
"CERN's new computing grid." It is a collaboration of many
organizations: many different organizations provide the computing
resources, and many provide the software that controls the grid and the
software that runs on the grid.

- Dave
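
P.S. For anyone trying to picture the data flow in the paragraph about
frontier.cern.ch, here is a minimal sketch of the pattern it describes:
an analysis job asks for a calibration payload over http, a local squid
acts as the caching proxy, and the response bytes are turned into
objects in memory. This is an illustration only, not the actual
Frontier client; the proxy address, URL, and JSON payload format below
are made up for the example.

    # Illustrative sketch only, not the real Frontier software.  The proxy
    # address, URL, and JSON payload format are assumptions for the example.
    import json
    import urllib.request

    SQUID_PROXY = "http://localhost:3128"                       # assumed local squid cache
    CONDITIONS_URL = "http://conditions.example.org/alignment"  # hypothetical data source

    def fetch_conditions(url, proxy):
        """Fetch a calibration payload through the squid proxy and
        deserialize the http response into objects in memory."""
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy}))
        with opener.open(url, timeout=60) as resp:
            return json.load(resp)      # http bytes -> in-memory objects

    if __name__ == "__main__":
        conditions = fetch_conditions(CONDITIONS_URL, SQUID_PROXY)
        print("loaded %d condition records" % len(conditions))

The reason for putting squid in the middle is that thousands of jobs
asking for the same ~100MB of alignment & calibration data get it from
a nearby cache instead of all going back to the central database at
CERN.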