Bob Talbot <snapper@st-abbs.fsnet.co.uk> writes:

> > In a message dated 1/28/03 4:11:16 PM EST, nimbo@ukonline.co.uk writes:
> >
> > > When I had a free page I had over a million file requests a week
>
> That is more than one "File Request" per second.
>
> What *exactly* do you mean by the phrase?

Oh, for once, this is quite clear. The http server answers file requests,
and server statistics show the number of these. Popularly they're called
"hits", and (perhaps typically) I find that the number of *page requests*
(which I presume means requests for files with a MIME type of text/html or
similar) is about 1/10 of that.

FWIW, in the last whole week I had 70299 file requests, and 5706 page
requests. (And I thought that was quite reasonable. It is about one server
'hit' every 10 seconds.) It doesn't seem impossible to have 10 times that,
though it is rather extraordinary for this number to drop to zero when the
thumbnail links stop working. Or perhaps I mean it does sound rather hard
to get to a million without anything to *read*.

> How do you measure it?
> If your web page has 10 objects on it does one visit generate 10 such
> requests?

Yes.

> If someone presses refresh, is that another 10?

Yes.

> How long do they stay?

Huh?

> Thinking: I can't believe there are 1000000 people per week actually
> looking at your site (well, possibly, but bear with me). I'm not
> actually convinced there are one million robots either ...

No: my 73,467 hits over the last 7 days went to 2,614 distinct hosts.
Perhaps a million hits might be around 20,000 surfers? Some 'distinct
hosts' may be different AOL proxies, but the same AOL proxy may serve n
people. Anything involving "Visitors" is (like "Length of Stay") mostly
fiction.

Brian Chandler
----------------
geo://Sano.Japan.Planet_3
Jigsaw puzzles from Japan at: http://imaginatorium.org/shop/
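
For anyone who wants to check the arithmetic in the thread, here is a small sketch. All the input numbers come straight from the post; the scaling estimate at the end is my own illustration of how one might extrapolate from hits-per-host, not the poster's stated method.

```python
# Back-of-the-envelope arithmetic behind the figures in the thread.
# Input numbers are taken from the post; the final extrapolation is
# only an illustration, since (as the post says) "visitor" counts
# derived from proxy-shared hosts are mostly fiction.

SECONDS_PER_WEEK = 7 * 24 * 3600  # 604,800

# "over a million file requests a week" -> more than one per second
million_rate = 1_000_000 / SECONDS_PER_WEEK
print(f"1M requests/week = {million_rate:.2f} requests/second")

# The poster's own week: 70,299 file requests
my_interval = SECONDS_PER_WEEK / 70_299
print(f"70,299 requests/week = one 'hit' every {my_interval:.1f} seconds")

# 73,467 hits spread over 2,614 distinct hosts
hits_per_host = 73_467 / 2_614
print(f"{hits_per_host:.1f} hits per distinct host")

# Scaling that same ratio up to a million hits gives a rough
# order-of-magnitude figure for how many hosts would be involved.
implied_hosts = 1_000_000 / hits_per_host
print(f"A million hits at that ratio = about {implied_hosts:,.0f} hosts")
```

The result is a few tens of thousands of distinct hosts, which is the same order of magnitude as the "around 20,000 surfers" guess in the post.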