ok. Next thing... I notice that some of the queries hitting us particularly hard are from spiders. Back in the puppet days, koji disallowed all spidering, and I have never seen useful search results from it. So, I'd like to block all the spiders and see if that helps with the load issues. robots.txt would have:

User-agent: *
Disallow: /

+1s?

kevin
--
diff --git a/roles/koji_hub/files/kojiweb.conf b/roles/koji_hub/files/kojiweb.conf
index 86abd2e..8334274 100644
--- a/roles/koji_hub/files/kojiweb.conf
+++ b/roles/koji_hub/files/kojiweb.conf
@@ -6,6 +6,8 @@ KeepAlive On
 Alias /koji "/usr/share/koji-web/scripts/wsgi_publisher.py"
 #(configuration goes in /etc/kojiweb/web.conf)
 
+Alias /robots.txt /var/www/html/robots.txt
+
 <Directory "/usr/share/koji-web/scripts/">
     Options ExecCGI
     SetHandler wsgi-script

diff --git a/roles/koji_hub/tasks/main.yml b/roles/koji_hub/tasks/main.yml
index 0b6cd82..a5fc795 100644
--- a/roles/koji_hub/tasks/main.yml
+++ b/roles/koji_hub/tasks/main.yml
@@ -171,6 +171,13 @@
   notify: restart httpd
   when: env != "staging"
 
+- name: koji robots.txt config
+  copy: src=robots.txt dest=/var/www/html/robots.txt
+  tags:
+  - config
+  - koji_hub
+  notify: restart httpd
+
 - name: kojira log dir
   file: dest=/var/log/kojira owner=root group=root mode=0750 state=directory
   tags:
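
PS: for whoever deploys this, here's a quick way to sanity-check that the robots.txt is actually being served and refuses crawlers. This is just a sketch on my end, not part of the patch, and the hostname is a placeholder -- substitute whichever koji frontend you want to test. It uses Python's stdlib robotparser:

#!/usr/bin/env python3
# Sketch only: verify a deployed robots.txt turns away all user agents.
# "koji.example.org" is a placeholder hostname, not the real frontend.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://koji.example.org/robots.txt")  # placeholder host
rp.read()  # fetch and parse the live robots.txt

# With "User-agent: *" / "Disallow: /" in place, can_fetch() should
# return False for any crawler name and any path on the site.
print(rp.can_fetch("Googlebot", "https://koji.example.org/koji/builds"))

If that prints False, well-behaved spiders will stop crawling once they re-fetch robots.txt.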