Pete,

I have written a log analyser in VBA which runs in Excel - I couldn't persuade Analog to produce the results I want. My need was to focus on response time, so I've added %T to a custom log format, as below:

LogFormat "%h %t %T \"%r\" %>s %b" performance

My code identifies individual users as it scans the access log. I use the IP address to identify users; you may need to use your SSL string instead - in principle there is no difference. It would not be difficult to modify the code in the IPAddress collection to record what you are looking for and then spit it out onto a spreadsheet on change of day or on reaching the access log EOF.

For each page requested I already produce the following (which intrinsically includes the count of D - the #Hits - that you need):

  #Hits, Min response, Median response, Mean response, Max response
  ditto for Bytes returned
  + the page ID requested

There are filtering facilities to get rid of requests for GIFs, Java status requests, etc. ... it's all documented too :o)

I scan up to 10,000 log lines/sec. Perl would do it faster, no doubt, but Windoze Excel is good enough for me.

Post a reply if you're interested further.

Graham

-----Original Message-----
From: pete@xxxxxxxxxxxxxx [mailto:pete@xxxxxxxxxxxxxx]
Sent: 14 April 2005 10:04
To: users@xxxxxxxxxxxxxxxx
Subject: [users@httpd] tracking user clicks thru a session id

Hi,

I've got an app where users log in, go through pages A to B, to C, and finish at D. I'm dumping the user's SSL session ID into each access log entry to identify each user, e.g.:

111.222.111.222 - - [13/Apr/2005:13:20:11 +0000] "POST /PAGEA HTTP/1.1" 70007 544 0BF4C6E30ACRA36FC6016B832697AEB1942B2154068C801ED3F9068450FE9B06
111.222.111.222 - - [13/Apr/2005:13:21:49 +0000] "POST /PAGEB HTTP/1.1" 70007 544 0BF4C6E30ACRA36FC6016B832697AEB1942B2154068C801ED3F9068450FE9B06

What I want to find out is how many users get to each point in the application, basically so I can report on people getting, say, to the checkout (C) but not actually purchasing (D), and on the number purchasing (D). People can cycle through certain pages, i.e. login - A - B - A - B - C - D, so I don't really want to just do a "cat access.log | grep 'PAGEA' | wc -l".

Before I get into loads of Perl, does anyone know of a web analysis tool or script that lets you track user clicks through a series of predefined pages (A, B, etc.) and report on where people dropped out? They all do lots of reports, but I can't find one that does this...

Any suggestions are appreciated.

Thanks,
Pete.

==========================================================
This email was sent by Ethicalwebsites.co.uk.
"Ethicalwebsites.co.uk - Internet Solutions for the UK"
http://www.ethicalwebsites.co.uk/

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
   "   from the digest: users-digest-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx
---------------------------------------------------------------------
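As an illustration only - this is not Graham's VBA, just the same idea sketched in Python - the per-page response-time summary he describes could be produced from his custom LogFormat roughly like this. The log file path is assumed to be the script's only argument:

# Sketch only: parse lines written by
#   LogFormat "%h %t %T \"%r\" %>s %b" performance
# and print #Hits, min, median, mean and max of %T per requested page.
import re
import statistics
import sys
from collections import defaultdict

# fields: host, [timestamp +zone], %T seconds, "METHOD /path PROTO", status, bytes
LINE = re.compile(r'^(\S+) (\[[^\]]+\]) (\d+) "(\S+) (\S+)[^"]*" (\d+) (\S+)')

times_by_page = defaultdict(list)          # page -> list of %T values (seconds)

for raw in open(sys.argv[1]):
    m = LINE.match(raw)
    if m is None:
        continue                           # skip lines not in this format
    times_by_page[m.group(5)].append(int(m.group(3)))

for page, times in sorted(times_by_page.items()):
    print(page, len(times), min(times),
          statistics.median(times),
          round(statistics.mean(times), 2),
          max(times))

%T only gives whole seconds; if finer resolution matters, %D (microseconds) can be logged instead and the parsing stays the same.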
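And for the drop-off report Pete is asking about, a minimal sketch along the same lines - assuming the SSL session ID is the last field on each log line (as in the sample lines above) and guessing /PAGEC and /PAGED as the checkout and purchase URLs - might look like:

# Sketch only: count how far each SSL session gets through the funnel.
# Assumes the session ID is the last field on every log line and that the
# funnel pages are /PAGEA ... /PAGED (adjust FUNNEL to the real URLs).
import re
import sys
from collections import defaultdict

FUNNEL = ["/PAGEA", "/PAGEB", "/PAGEC", "/PAGED"]   # hypothetical page order
STAGE = {page: i for i, page in enumerate(FUNNEL)}

REQUEST = re.compile(r'"\S+ (\S+)[^"]*"')           # path inside "METHOD /path PROTO"

furthest = defaultdict(lambda: -1)                  # session ID -> furthest stage index

for raw in open(sys.argv[1]):
    m = REQUEST.search(raw)
    if m is None or m.group(1) not in STAGE:
        continue                                    # ignore non-funnel requests
    session = raw.rsplit(None, 1)[-1]               # last field = SSL session ID
    furthest[session] = max(furthest[session], STAGE[m.group(1)])

# Sessions that got at least as far as each page, and sessions that stopped
# there (e.g. reached the checkout C but never the purchase page D).
for i, page in enumerate(FUNNEL):
    reached = sum(1 for s in furthest.values() if s >= i)
    stopped = sum(1 for s in furthest.values() if s == i)
    print(f"{page}: reached by {reached} sessions, dropped out here: {stopped}")

Because it records only the furthest funnel stage each session reached, cycling through A - B - A - B doesn't inflate the counts; the same logic would be a few lines of Perl if that's preferred.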