
Commit 6b0ea00d by Rémy Coutable

Merge branch 'correct_robots_txt' into 'master'

correct User-agent placement in robots.txt

Closes #26807

See merge request !8623
parents 1cc6d206 8f427208
---
title: "Correct User-agent placement in robots.txt"
merge_request: 8623
author: Eric Sabelhaus
@@ -4,13 +4,12 @@
# User-Agent: *
# Disallow: /
User-Agent: *
# Add a 1 second delay between successive requests to the same server, limits resources used by crawler
# Only some crawlers respect this setting, e.g. Googlebot does not
# Crawl-delay: 1
# Based on details in https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/routes.rb, https://gitlab.com/gitlab-org/gitlab-ce/blob/master/spec/routing, and using application
User-Agent: *
Disallow: /autocomplete/users
Disallow: /search
Disallow: /api
@@ -23,12 +22,14 @@ Disallow: /groups/*/edit
Disallow: /users
# Global snippets
User-Agent: *
Disallow: /s/
Disallow: /snippets/new
Disallow: /snippets/*/edit
Disallow: /snippets/*/raw
# Project details
User-Agent: *
Disallow: /*/*.git
Disallow: /*/*/fork/new
Disallow: /*/*/repository/archive*
......
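Why placement matters: in the robots.txt convention, Disallow rules only take effect inside the group opened by the most recent User-agent line, so rules that sit above or outside a User-agent declaration can be ignored by crawlers. The following is a minimal sketch (not part of this merge request) using Python's standard urllib.robotparser to check how a crawler would interpret a few of the rules from the diff above; the gitlab.example.com host is a placeholder, and only literal path prefixes are used because robotparser does not understand the "*" wildcards seen in rules such as "Disallow: /*/*.git".

from urllib import robotparser

# Paraphrased subset of the rules from the diff above: one User-Agent
# group followed by its Disallow rules.
lines = [
    "User-Agent: *",
    "Disallow: /autocomplete/users",
    "Disallow: /search",
    "Disallow: /api",
    "Disallow: /s/",
    "Disallow: /snippets/new",
]

parser = robotparser.RobotFileParser()
parser.parse(lines)

# Paths under a Disallow prefix are blocked for any user agent...
print(parser.can_fetch("*", "https://gitlab.example.com/search"))   # False
# ...while paths not covered by any rule remain fetchable.
print(parser.can_fetch("*", "https://gitlab.example.com/explore"))  # True

A quick check like this makes the effect of the fix visible: if the Disallow lines were parsed without a preceding User-Agent group, well-behaved crawlers would treat those paths as fetchable.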