author     Mary Umoh <umohm12@gmail.com>                2017-08-24 03:03:12 +0200
committer  Dylan William Hardison <dylan@hardison.net>  2017-08-24 03:03:12 +0200
commit     1feabf50777a6f0f4193f467faad9f996e729367 (patch)
tree       4dee23a909a568ff3cbd7782cb28dc50ddd97018 /template
parent     344c9475c95d5c7a9801855e991ba18ed9587ec6 (diff)
Bug 1393145 - See if we can add a directive/rule to robots.txt to get search engines to crawl more slowly
Diffstat (limited to 'template')
-rw-r--r--  template/en/default/robots.txt.tmpl  1 +
1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/template/en/default/robots.txt.tmpl b/template/en/default/robots.txt.tmpl
index d8c1b5b86..c4948efe5 100644
--- a/template/en/default/robots.txt.tmpl
+++ b/template/en/default/robots.txt.tmpl
@@ -1,5 +1,6 @@
 User-agent: *
 Disallow: /
+Crawl-delay: 30
 [% IF NOT urlbase.matches("bugzilla-dev") %]
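
For context, a rough sketch of the robots.txt the modified template might render after this change (assuming the remaining template lines, which are truncated in the hunk above, are left as they are). Crawl-delay is an advisory directive: crawlers that honor it, such as Bing and Yandex, treat the value as a minimum wait in seconds between requests, while Googlebot ignores it.

  # hypothetical rendered output of robots.txt.tmpl after this patch
  User-agent: *
  Disallow: /
  Crawl-delay: 30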