Petr Baudis <pasky@xxxxxxx> writes:

> On repo.or.cz (permanently I/O overloaded and hosting 1050 projects +
> forks), the projects list (the default gitweb page) can take more than
> a minute to generate. This naive patch adds simple support for caching
> the projects list data structure so that all the projects do not need
> to get rescanned at every page access.

Nice. BTW adding caching to gitweb is one of the proposed ideas (projects)
for Google Summer of Code 2008:

  http://git.or.cz/gitwiki/SoC2008Ideas

> For clarity, projects scanning and @projects population is separated
> out into git_get_projects_details().

Perhaps this could be submitted as a separate patch? I could do this if
you are otherwise busy...

[...]

> +	if ($cache_lifetime and -f $cache_file
> +	    and stat($cache_file)->mtime + $cache_lifetime * 60 > time()
> +	    and open (my $fd, $cache_file)) {
> +		$stale = time() - stat($cache_file)->mtime;
> +		my @dump = <$fd>;
> +		close $fd;
> +		# Hack zone start
> +		my $VAR1;
> +		eval join("\n", @dump);
> +		@projects = @$VAR1;
> +		# Hack zone end

Why do you read line by line, only to join it, i.e.

	my @dump = <$fd>;
	...
	join("\n", @dump);

instead of slurping the whole file in one go:

	local $/ = undef;
	my $dump = <$fd>;
	...
	$dump;

Besides, why do you use Data::Dumper instead of Storable? Both are
distributed with Perl; well, at least both are in perl-5.8.6-24.

[...]

> -	git_project_list_body(\@list, $order);
> +	git_project_list_body(\@list, $order, undef, undef, undef, undef, $projlist_cache_lifetime);

This is ugly. Why not use a hash of "named parameters", as is done in a
few other places in gitweb (search for '%opts')?

-- 
Jakub Narebski
Poland
ShadeHawk on #git
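
For illustration, a minimal sketch of the Storable-based caching suggested
above; it assumes the same $cache_file, $cache_lifetime and @projects
variables as Petr's patch, and the helper names are made up:

	use File::stat;
	use Storable qw(store retrieve);

	# Write the cache: store() serializes the array (via a reference)
	# straight to disk, so no Data::Dumper/eval round-trip is needed.
	sub projects_cache_store {
		my ($cache_file, @projects) = @_;
		store(\@projects, $cache_file);
	}

	# Read the cache back if it is fresh enough; retrieve() slurps and
	# deserializes the whole file in one go.
	sub projects_cache_fetch {
		my ($cache_file, $cache_lifetime) = @_;
		return () unless $cache_lifetime && -f $cache_file;
		return () unless stat($cache_file)->mtime + $cache_lifetime * 60 > time();
		my $cached = eval { retrieve($cache_file) };
		return defined $cached ? @$cached : ();
	}

And for the "named parameters" point, a sketch of the %opts style gitweb
already uses elsewhere; the '-cache_lifetime' key is only an example name,
and the other positional parameters of git_project_list_body are omitted
for brevity:

	sub git_project_list_body {
		my ($projlist, $order, %opts) = @_;
		my $cache_lifetime = $opts{'-cache_lifetime'};
		# ...
	}

	git_project_list_body(\@list, $order,
	                      -cache_lifetime => $projlist_cache_lifetime);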