author    Mary Umoh <umohm12@gmail.com>  2017-08-24 03:03:12 +0200
committer Dylan William Hardison <dylan@hardison.net>  2017-08-24 03:03:12 +0200
commit  1feabf50777a6f0f4193f467faad9f996e729367 (patch)
tree    4dee23a909a568ff3cbd7782cb28dc50ddd97018 /template/en/default/robots.txt.tmpl
parent  344c9475c95d5c7a9801855e991ba18ed9587ec6 (diff)
Bug 1393145 - See if we can add a directive/rule to robots.txt to get search engines to crawl more slowly
Diffstat (limited to 'template/en/default/robots.txt.tmpl')
 template/en/default/robots.txt.tmpl | 1 +
 1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/template/en/default/robots.txt.tmpl b/template/en/default/robots.txt.tmpl
index d8c1b5b86..c4948efe5 100644
--- a/template/en/default/robots.txt.tmpl
+++ b/template/en/default/robots.txt.tmpl
@@ -1,5 +1,6 @@
 User-agent: *
 Disallow: /
+Crawl-delay: 30
 [% IF NOT urlbase.matches("bugzilla-dev") %]
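As a sanity check on the patch, the rendered output can be fed to Python's standard-library robots.txt parser to confirm the new directive is picked up. This is a minimal sketch, not part of the commit: the `robots_txt` string below is a hypothetical rendering of the template after this change (the trailing `[% IF %]` block is elided), and `ExampleBot` is an invented user-agent name. Note that `Crawl-delay` is a non-standard extension: some crawlers (e.g. Bing, Yandex) honor it, while Googlebot ignores it.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rendered robots.txt after this patch; the template's
# [% IF NOT urlbase.matches("bugzilla-dev") %] section is elided.
robots_txt = """\
User-agent: *
Disallow: /
Crawl-delay: 30
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The wildcard rule applies to any user agent, so the 30-second
# delay and the blanket Disallow both take effect.
print(parser.crawl_delay("ExampleBot"))                       # -> 30
print(parser.can_fetch("ExampleBot", "/show_bug.cgi?id=1"))   # -> False
```

`crawl_delay()` returns `None` when no delay is declared for the matching group, so compliant crawlers fall back to their own pacing in that case.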