Petr Baudis <pasky@xxxxxxx> writes:

> On repo.or.cz (permanently I/O overloaded and hosting 1050 project +
> forks),

It looks like repo.or.cz is overwhelmed by its success. I hope that now
that there are other software hosting sites offering git hosting
(Savannah, GitHub, Gitorious, ...), the number of projects won't grow
as rapidly.

> the projects list (the default gitweb page) can take more than
> a minute to generate. This naive patch adds simple support for caching
> the projects list data structure so that all the projects do not need
> to get rescanned at every page access.

Another solution would be to split the projects list page into pages,
perhaps adding a search box for finding a project (by name, by
description, or by owner). Nevertheless, even with pagination we do
need caching if we want to have "sort by last update".

[...]

> +# projects list cache for busy sites with many projects;
> +# if you set this to non-zero, it will be used as the cached
> +# index lifetime in minutes
> +# the cached list version is stored in /tmp and can be tweaked
> +# by other scripts running with the same uid as gitweb - use this
> +# only at secure installations; only single gitweb project root per
> +# system is supported!
> +our $projlist_cache_lifetime = 0;

[...]

> +sub git_project_list_body {
[...]
> +	my $cache_file = '/tmp/gitweb.index.cache';
> +	use File::stat;
> +
> +	my @projects;
> +	my $stale = 0;
> +	if ($cache_lifetime and -f $cache_file
> +	    and stat($cache_file)->mtime + $cache_lifetime * 60 > time()
> +	    and open (my $fd, $cache_file)) {
> +		$stale = time() - stat($cache_file)->mtime;
> +		my @dump = <$fd>;
> +		close $fd;
> +		# Hack zone start
> +		my $VAR1;
> +		eval join("\n", @dump);
> +		@projects = @$VAR1;
> +		# Hack zone end
> +	} else {
> +		if ($cache_lifetime and -f $cache_file) {
> +			# Postpone timeout by two minutes so that we get
> +			# enough time to do our job.
> +			my $time = time() - $cache_lifetime + 120;
> +			utime $time, $time, $cache_file;
> +		}
> +		@projects = git_get_projects_details($projlist, $check_forks);
> +		if ($cache_lifetime and open (my $fd, '>'.$cache_file)) {
> +			use Data::Dumper;
> +			print $fd Dumper(\@projects);
> +			close $fd;
> +		}
> +	}

This could be much simplified with perl-cache (perl-Cache-Cache).
Unfortunately this is a non-standard module, not distributed (yet?)
with Perl.

Warning: not tested in gitweb!

+	use Cache::FileCache;
+
+	my $cache;
+	my $projects;
+
+	if ($cache_lifetime) {
+		$cache = new Cache::FileCache(
+			{ namespace          => 'gitweb',
+			  # $cache_lifetime is in minutes,
+			  # default_expires_in is in seconds
+			  default_expires_in => $cache_lifetime * 60,
+			});
+		$projects = $cache->get('projects_list');
+	}
+	if (!defined $projects) {
+		$projects = [ git_get_projects_details($projlist, $check_forks) ];
+		$cache->set('projects_list', $projects)
+			if defined $cache;
+	}

-- 
Jakub Narebski
Poland
ShadeHawk on #git
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html
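P.S. The mtime-based cache from the patch can be demonstrated standalone.
The sketch below is my own illustration, not gitweb code: the helper name
(cache_or_compute) and cache path are made up, and it serializes with the
core Storable module instead of Data::Dumper + eval, which avoids
executing the cache file as Perl code.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(store retrieve);
use File::Temp qw(tempdir);

# Return the cached list if the cache file is younger than $lifetime
# seconds; otherwise recompute it via $compute and refresh the cache.
sub cache_or_compute {
	my ($cache_file, $lifetime, $compute) = @_;
	if (-f $cache_file && (stat $cache_file)[9] + $lifetime > time()) {
		# retrieve() dies on a corrupt file, so guard with eval
		my $cached = eval { retrieve($cache_file) };
		return @$cached if defined $cached;
	}
	my @projects = $compute->();
	store(\@projects, $cache_file);
	return @projects;
}

my $dir   = tempdir(CLEANUP => 1);
my $calls = 0;
my $compute = sub { $calls++; return ('git.git', 'repo.or.cz') };

# First call populates the cache, second call is served from it.
my @first  = cache_or_compute("$dir/projects.cache", 60, $compute);
my @second = cache_or_compute("$dir/projects.cache", 60, $compute);
print "computed $calls time(s), got: @second\n";
```

Unlike the patch, this does not postpone the timeout while recomputing,
so several overlapping requests could still rescan in parallel.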